22736 1727204234.54279: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-twx
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
22736 1727204234.54878: Added group all to inventory
22736 1727204234.54881: Added group ungrouped to inventory
22736 1727204234.54887: Group all now contains ungrouped
22736 1727204234.54892: Examining possible inventory source: /tmp/network-6Zh/inventory-Sfc.yml
22736 1727204234.85329: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
22736 1727204234.85524: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
22736 1727204234.85554: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
22736 1727204234.85743: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
22736 1727204234.85973: Loaded config def from plugin (inventory/script)
22736 1727204234.85976: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
22736 1727204234.86100: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
22736 1727204234.86281: Loaded config def from plugin (inventory/yaml)
22736 1727204234.86284: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
22736 1727204234.86515: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
22736 1727204234.87721: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
22736 1727204234.87725: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
22736 1727204234.87729: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
22736 1727204234.87736: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
22736 1727204234.87856: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
22736 1727204234.88030: /tmp/network-6Zh/inventory-Sfc.yml was not parsable by auto
22736 1727204234.88234: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
22736 1727204234.88280: Loading data from /tmp/network-6Zh/inventory-Sfc.yml
22736 1727204234.88572: group all already in inventory
22736 1727204234.88580: set inventory_file for managed-node1
22736 1727204234.88585: set inventory_dir for managed-node1
22736 1727204234.88586: Added host managed-node1 to inventory
22736 1727204234.88591: Added host managed-node1 to group all
22736 1727204234.88592: set ansible_host for managed-node1
22736 1727204234.88593: set ansible_ssh_extra_args for managed-node1
22736 1727204234.88598: set inventory_file for managed-node2
22736 1727204234.88601: set inventory_dir for managed-node2
22736 1727204234.88603: Added host managed-node2 to inventory
22736 1727204234.88605: Added host managed-node2 to group
all 22736 1727204234.88606: set ansible_host for managed-node2 22736 1727204234.88607: set ansible_ssh_extra_args for managed-node2 22736 1727204234.88645: set inventory_file for managed-node3 22736 1727204234.88649: set inventory_dir for managed-node3 22736 1727204234.88650: Added host managed-node3 to inventory 22736 1727204234.88652: Added host managed-node3 to group all 22736 1727204234.88653: set ansible_host for managed-node3 22736 1727204234.88654: set ansible_ssh_extra_args for managed-node3 22736 1727204234.88658: Reconcile groups and hosts in inventory. 22736 1727204234.88663: Group ungrouped now contains managed-node1 22736 1727204234.88666: Group ungrouped now contains managed-node2 22736 1727204234.88668: Group ungrouped now contains managed-node3 22736 1727204234.88771: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name 22736 1727204234.89124: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments 22736 1727204234.89318: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py 22736 1727204234.89356: Loaded config def from plugin (vars/host_group_vars) 22736 1727204234.89359: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True) 22736 1727204234.89367: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars 22736 1727204234.89493: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False) 22736 1727204234.89554: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False) 22736 1727204234.90381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204234.90510: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py 22736 1727204234.90570: Loaded config def from plugin (connection/local) 22736 1727204234.90574: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True) 22736 1727204234.91569: Loaded config def from plugin (connection/paramiko_ssh) 22736 1727204234.91573: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True) 22736 1727204234.92896: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22736 1727204234.92954: Loaded config def from plugin (connection/psrp) 22736 1727204234.92958: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True) 22736 1727204234.94090: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22736 1727204234.94154: Loaded config def from plugin (connection/ssh) 22736 1727204234.94158: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True) 22736 1727204234.96966: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False) 22736 1727204234.97024: Loaded config def from plugin (connection/winrm) 22736 1727204234.97033: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True) 22736 1727204234.97072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name 22736 1727204234.97170: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py 22736 1727204234.97276: Loaded config def from plugin (shell/cmd) 22736 1727204234.97278: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True) 22736 1727204234.97314: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False) 22736 1727204234.97422: Loaded config def from plugin (shell/powershell) 22736 1727204234.97424: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True) 22736 1727204234.97496: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py 22736 1727204234.97759: Loaded config def from plugin (shell/sh) 22736 1727204234.97762: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True) 22736 1727204234.97811: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name 22736 1727204234.97992: Loaded config def from plugin (become/runas) 22736 1727204234.97994: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True) 22736 1727204234.98274: Loaded config def from plugin (become/su) 22736 1727204234.98277: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True) 22736 1727204234.98515: Loaded config def from plugin (become/sudo) 22736 1727204234.98518: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True) running playbook inside collection fedora.linux_system_roles 22736 1727204234.98565: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml 22736 1727204234.99095: in VariableManager get_vars() 22736 1727204234.99124: done with get_vars() 22736 1727204234.99303: trying /usr/local/lib/python3.12/site-packages/ansible/modules 22736 1727204235.03913: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action 22736 1727204235.04071: in VariableManager get_vars() 22736 1727204235.04077: done with get_vars() 22736 1727204235.04081: variable 'playbook_dir' from source: magic vars 22736 1727204235.04082: variable 'ansible_playbook_python' from source: magic vars 22736 1727204235.04083: variable 'ansible_config_file' from 
source: magic vars 22736 1727204235.04084: variable 'groups' from source: magic vars 22736 1727204235.04084: variable 'omit' from source: magic vars 22736 1727204235.04085: variable 'ansible_version' from source: magic vars 22736 1727204235.04086: variable 'ansible_check_mode' from source: magic vars 22736 1727204235.04087: variable 'ansible_diff_mode' from source: magic vars 22736 1727204235.04088: variable 'ansible_forks' from source: magic vars 22736 1727204235.04093: variable 'ansible_inventory_sources' from source: magic vars 22736 1727204235.04094: variable 'ansible_skip_tags' from source: magic vars 22736 1727204235.04095: variable 'ansible_limit' from source: magic vars 22736 1727204235.04096: variable 'ansible_run_tags' from source: magic vars 22736 1727204235.04097: variable 'ansible_verbosity' from source: magic vars 22736 1727204235.04149: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml 22736 1727204235.05111: in VariableManager get_vars() 22736 1727204235.05133: done with get_vars() 22736 1727204235.05184: in VariableManager get_vars() 22736 1727204235.05217: done with get_vars() 22736 1727204235.05264: in VariableManager get_vars() 22736 1727204235.05279: done with get_vars() 22736 1727204235.05327: in VariableManager get_vars() 22736 1727204235.05341: done with get_vars() 22736 1727204235.05445: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22736 1727204235.05754: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22736 1727204235.05956: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22736 1727204235.06940: in VariableManager get_vars() 22736 1727204235.06971: done with get_vars() 22736 1727204235.07539: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__ 22736 1727204235.07748: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__ redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204235.09511: in VariableManager get_vars() 22736 1727204235.09535: done with get_vars() 22736 1727204235.09746: in VariableManager get_vars() 22736 1727204235.09751: done with get_vars() 22736 1727204235.09754: variable 'playbook_dir' from source: magic vars 22736 1727204235.09755: variable 'ansible_playbook_python' from source: magic vars 22736 1727204235.09756: variable 'ansible_config_file' from source: magic vars 22736 1727204235.09757: variable 'groups' from source: magic vars 22736 1727204235.09758: variable 'omit' from source: magic vars 22736 1727204235.09759: variable 'ansible_version' from source: magic vars 22736 1727204235.09760: variable 'ansible_check_mode' from source: magic vars 22736 1727204235.09761: variable 'ansible_diff_mode' from source: magic vars 22736 1727204235.09762: variable 'ansible_forks' from source: magic vars 22736 1727204235.09763: variable 'ansible_inventory_sources' from source: magic vars 22736 1727204235.09764: variable 'ansible_skip_tags' from source: magic vars 22736 1727204235.09765: variable 'ansible_limit' from source: magic vars 22736 1727204235.09766: variable 'ansible_run_tags' from source: magic vars 22736 1727204235.09767: variable 'ansible_verbosity' from source: magic vars 22736 1727204235.09814: Loading data from 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml 22736 1727204235.09914: in VariableManager get_vars() 22736 1727204235.09919: done with get_vars() 22736 1727204235.09921: variable 'playbook_dir' from source: magic vars 22736 1727204235.09922: variable 'ansible_playbook_python' from source: magic vars 22736 1727204235.09923: variable 'ansible_config_file' from source: magic vars 22736 1727204235.09924: variable 'groups' from source: magic vars 22736 1727204235.09930: variable 'omit' from source: magic vars 22736 1727204235.09931: variable 'ansible_version' from source: magic vars 22736 1727204235.09932: variable 'ansible_check_mode' from source: magic vars 22736 1727204235.09933: variable 'ansible_diff_mode' from source: magic vars 22736 1727204235.09934: variable 'ansible_forks' from source: magic vars 22736 1727204235.09935: variable 'ansible_inventory_sources' from source: magic vars 22736 1727204235.09936: variable 'ansible_skip_tags' from source: magic vars 22736 1727204235.09937: variable 'ansible_limit' from source: magic vars 22736 1727204235.09938: variable 'ansible_run_tags' from source: magic vars 22736 1727204235.09939: variable 'ansible_verbosity' from source: magic vars 22736 1727204235.09981: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml 22736 1727204235.10096: in VariableManager get_vars() 22736 1727204235.10110: done with get_vars() 22736 1727204235.10169: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22736 1727204235.10333: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22736 1727204235.10448: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22736 1727204235.11130: in VariableManager get_vars() 22736 1727204235.11156: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204235.13271: in VariableManager get_vars() 22736 1727204235.13297: done with get_vars() 22736 1727204235.13350: in VariableManager get_vars() 22736 1727204235.13354: done with get_vars() 22736 1727204235.13357: variable 'playbook_dir' from source: magic vars 22736 1727204235.13358: variable 'ansible_playbook_python' from source: magic vars 22736 1727204235.13359: variable 'ansible_config_file' from source: magic vars 22736 1727204235.13360: variable 'groups' from source: magic vars 22736 1727204235.13361: variable 'omit' from source: magic vars 22736 1727204235.13362: variable 'ansible_version' from source: magic vars 22736 1727204235.13363: variable 'ansible_check_mode' from source: magic vars 22736 1727204235.13364: variable 'ansible_diff_mode' from source: magic vars 22736 1727204235.13365: variable 'ansible_forks' from source: magic vars 22736 1727204235.13366: variable 'ansible_inventory_sources' from source: magic vars 22736 1727204235.13367: variable 'ansible_skip_tags' from source: magic vars 22736 1727204235.13368: variable 'ansible_limit' from source: magic vars 22736 1727204235.13369: variable 'ansible_run_tags' from source: magic vars 22736 1727204235.13370: variable 'ansible_verbosity' from source: magic vars 22736 1727204235.13414: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml 22736 1727204235.13515: in 
VariableManager get_vars() 22736 1727204235.13537: done with get_vars() 22736 1727204235.13595: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml 22736 1727204235.15429: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml 22736 1727204235.15546: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml 22736 1727204235.16131: in VariableManager get_vars() 22736 1727204235.16156: done with get_vars() redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204235.18296: in VariableManager get_vars() 22736 1727204235.18314: done with get_vars() 22736 1727204235.18364: in VariableManager get_vars() 22736 1727204235.18378: done with get_vars() 22736 1727204235.18462: in VariableManager get_vars() 22736 1727204235.18477: done with get_vars() 22736 1727204235.18607: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback 22736 1727204235.18625: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__ redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug 22736 1727204235.18921: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py 22736 1727204235.19163: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug) 22736 1727204235.19167: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) 22736 1727204235.19212: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name 22736 1727204235.19247: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False) 22736 1727204235.19507: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py 22736 1727204235.19601: Loaded config def from plugin (callback/default) 22736 1727204235.19605: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 22736 1727204235.21140: Loaded config def from plugin (callback/junit) 22736 1727204235.21145: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True) 22736 1727204235.21209: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False) 22736 1727204235.21308: Loaded config def from plugin 
(callback/minimal)
22736 1727204235.21311: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
22736 1727204235.21360: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
22736 1727204235.21448: Loaded config def from plugin (callback/tree)
22736 1727204235.21451: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
22736 1727204235.21629: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
22736 1727204235.21632: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-twx/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.

PLAYBOOK: tests_ethernet_nm.yml ************************************************
10 plays in /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml
22736 1727204235.21664: in VariableManager get_vars()
22736 1727204235.21682: done with get_vars()
22736 1727204235.21691: in VariableManager get_vars()
22736 1727204235.21707: done with get_vars()
22736 1727204235.21713: variable 'omit' from source: magic vars
22736 1727204235.21763: in VariableManager get_vars()
22736 1727204235.21780: done with get_vars()
22736 1727204235.21813: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_ethernet.yml' with nm as provider] *********
22736 1727204235.22506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
22736 1727204235.22599: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
22736 1727204235.22633: getting the remaining hosts for this loop
22736 1727204235.22635: done getting the remaining hosts for this loop
22736 1727204235.22639: getting the next task for host managed-node2
22736 1727204235.22643: done getting next task for host managed-node2
22736 1727204235.22646: ^ task is: TASK: Gathering Facts
22736 1727204235.22648: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
22736 1727204235.22655: getting variables
22736 1727204235.22657: in VariableManager get_vars()
22736 1727204235.22668: Calling all_inventory to load vars for managed-node2
22736 1727204235.22671: Calling groups_inventory to load vars for managed-node2
22736 1727204235.22674: Calling all_plugins_inventory to load vars for managed-node2
22736 1727204235.22699: Calling all_plugins_play to load vars for managed-node2
22736 1727204235.22714: Calling groups_plugins_inventory to load vars for managed-node2
22736 1727204235.22719: Calling groups_plugins_play to load vars for managed-node2
22736 1727204235.22762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
22736 1727204235.22840: done with get_vars()
22736 1727204235.22848: done getting variables
22736 1727204235.22936: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6
Tuesday 24 September 2024 14:57:15 -0400 (0:00:00.014) 0:00:00.014 *****
22736 1727204235.22961: entering _queue_task() for managed-node2/gather_facts
22736 1727204235.22962: Creating lock for gather_facts
22736 1727204235.23404: worker is 1 (out of 1 available)
22736 1727204235.23419: exiting _queue_task() for managed-node2/gather_facts
22736 1727204235.23434: done queuing things up, now waiting for results queue to drain
22736 1727204235.23436: waiting for pending results...
22736 1727204235.23809: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204235.23818: in run() - task 12b410aa-8751-4f4a-548a-00000000007c 22736 1727204235.23821: variable 'ansible_search_path' from source: unknown 22736 1727204235.23824: calling self._execute() 22736 1727204235.23887: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204235.23905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204235.23923: variable 'omit' from source: magic vars 22736 1727204235.24052: variable 'omit' from source: magic vars 22736 1727204235.24091: variable 'omit' from source: magic vars 22736 1727204235.24146: variable 'omit' from source: magic vars 22736 1727204235.24199: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204235.24259: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204235.24284: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204235.24315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204235.24339: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204235.24377: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204235.24386: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204235.24396: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204235.24532: Set connection var ansible_timeout to 10 22736 1727204235.24557: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204235.24573: Set connection var ansible_shell_executable to /bin/sh 22736 1727204235.24580: Set connection var ansible_shell_type to sh 22736 1727204235.24593: Set connection var ansible_pipelining to False 22736 1727204235.24601: Set connection var ansible_connection to ssh 22736 1727204235.24633: variable 'ansible_shell_executable' from source: unknown 22736 1727204235.24641: variable 'ansible_connection' from source: unknown 22736 1727204235.24648: variable 'ansible_module_compression' from source: unknown 22736 1727204235.24660: variable 'ansible_shell_type' from source: unknown 22736 1727204235.24668: variable 'ansible_shell_executable' from source: unknown 22736 1727204235.24677: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204235.24685: variable 'ansible_pipelining' from source: unknown 22736 1727204235.24695: variable 'ansible_timeout' from source: unknown 22736 1727204235.24768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204235.24931: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204235.24949: variable 'omit' from source: magic vars 22736 1727204235.24958: starting attempt loop 22736 1727204235.24966: running the handler 22736 1727204235.24995: variable 'ansible_facts' from source: unknown 22736 1727204235.25023: _low_level_execute_command(): starting 22736 1727204235.25035: 
_low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204235.25830: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204235.25899: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204235.25974: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204235.25996: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204235.26020: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204235.26072: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204235.27857: stdout chunk (state=3): >>>/root <<< 22736 1727204235.28067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204235.28071: stdout chunk (state=3): >>><<< 22736 1727204235.28074: stderr chunk (state=3): >>><<< 22736 1727204235.28212: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204235.28216: _low_level_execute_command(): starting 22736 1727204235.28220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795 `" && echo ansible-tmp-1727204235.2810752-22806-7188534212795="` echo /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795 `" ) && sleep 0' 22736 1727204235.28822: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204235.28839: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204235.28856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204235.28879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204235.28908: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204235.29023: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204235.29056: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204235.29137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204235.31227: stdout chunk (state=3): >>>ansible-tmp-1727204235.2810752-22806-7188534212795=/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795 <<< 22736 1727204235.31444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204235.31448: stdout chunk (state=3): >>><<< 22736 1727204235.31450: stderr chunk (state=3): >>><<< 22736 1727204235.31550: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204235.2810752-22806-7188534212795=/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204235.31554: variable 'ansible_module_compression' from source: unknown 22736 1727204235.31599: ANSIBALLZ: Using generic lock for ansible.legacy.setup 22736 1727204235.31607: ANSIBALLZ: Acquiring lock 22736 1727204235.31615: ANSIBALLZ: Lock acquired: 140553536881728 22736 1727204235.31625: ANSIBALLZ: Creating module 22736 1727204235.94021: ANSIBALLZ: Writing module into payload 22736 1727204235.94419: ANSIBALLZ: Writing 
module 22736 1727204235.94594: ANSIBALLZ: Renaming module 22736 1727204235.94598: ANSIBALLZ: Done creating module 22736 1727204235.94680: variable 'ansible_facts' from source: unknown 22736 1727204235.94699: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204235.94713: _low_level_execute_command(): starting 22736 1727204235.94723: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 22736 1727204235.96160: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204235.96165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204235.96207: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204235.96344: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204235.96443: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204235.96461: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204235.98394: stdout chunk (state=3): >>>PLATFORM <<< 22736 1727204235.98398: stdout chunk (state=3): >>>Linux <<< 22736 1727204235.98401: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 <<< 22736 1727204235.98404: stdout chunk (state=3): >>>/usr/bin/python3 ENDFOUND <<< 22736 1727204235.98504: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204235.98763: stderr chunk (state=3): >>><<< 22736 1727204235.98767: stdout chunk (state=3): >>><<< 22736 1727204235.98792: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204235.98815 [managed-node2]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 22736 1727204235.98868: _low_level_execute_command(): starting 22736 1727204235.98873: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 22736 1727204235.99222: Sending initial data 22736 1727204235.99232: Sent initial data (1181 bytes) 22736 1727204236.00406: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204236.00461: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204236.00482: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204236.00498: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204236.00705: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204236.04495: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} <<< 22736 1727204236.05006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204236.05019: stdout chunk (state=3): >>><<< 22736 1727204236.05398: stderr chunk (state=3): >>><<< 22736 1727204236.05402: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"Fedora Linux\"\nVERSION=\"39 (Thirty Nine)\"\nID=fedora\nVERSION_ID=39\nVERSION_CODENAME=\"\"\nPLATFORM_ID=\"platform:f39\"\nPRETTY_NAME=\"Fedora Linux 39 (Thirty 
Nine)\"\nANSI_COLOR=\"0;38;2;60;110;180\"\nLOGO=fedora-logo-icon\nCPE_NAME=\"cpe:/o:fedoraproject:fedora:39\"\nDEFAULT_HOSTNAME=\"fedora\"\nHOME_URL=\"https://fedoraproject.org/\"\nDOCUMENTATION_URL=\"https://docs.fedoraproject.org/en-US/fedora/f39/system-administrators-guide/\"\nSUPPORT_URL=\"https://ask.fedoraproject.org/\"\nBUG_REPORT_URL=\"https://bugzilla.redhat.com/\"\nREDHAT_BUGZILLA_PRODUCT=\"Fedora\"\nREDHAT_BUGZILLA_PRODUCT_VERSION=39\nREDHAT_SUPPORT_PRODUCT=\"Fedora\"\nREDHAT_SUPPORT_PRODUCT_VERSION=39\nSUPPORT_END=2024-11-12\n"} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204236.05405: variable 'ansible_facts' from source: unknown 22736 1727204236.05408: variable 'ansible_facts' from source: unknown 22736 1727204236.05410: variable 'ansible_module_compression' from source: unknown 22736 1727204236.05454: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204236.05492: variable 'ansible_facts' from source: unknown 22736 1727204236.05686: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py 22736 1727204236.05970: Sending initial data 22736 1727204236.05980: Sent initial data (152 bytes) 22736 1727204236.06598: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204236.06613: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204236.06696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204236.06715: stderr chunk (state=3): >>>debug2: fd 
3 setting O_NONBLOCK <<< 22736 1727204236.06732: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204236.07010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204236.08600: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204236.08625: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204236.08671: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmps1x6zxlz /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py <<< 22736 1727204236.08682: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py" <<< 22736 1727204236.08819: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmps1x6zxlz" to remote "/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py" <<< 22736 1727204236.15917: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204236.15931: stdout chunk (state=3): >>><<< 22736 1727204236.16013: stderr chunk (state=3): >>><<< 22736 1727204236.16227: done transferring module to remote 22736 1727204236.16231: _low_level_execute_command(): starting 22736 1727204236.16234: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/ /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py && sleep 0' 22736 1727204236.17977: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204236.18413: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204236.18448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204236.18463: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204236.18750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204236.20932: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204236.21029: stderr chunk (state=3): >>><<< 22736 1727204236.21040: stdout chunk (state=3): >>><<< 22736 1727204236.21126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204236.21224: _low_level_execute_command(): starting 22736 1727204236.21228: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/AnsiballZ_setup.py && sleep 0' 22736 1727204236.22549: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204236.22554: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204236.22766: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204236.22770: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204236.23034: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204236.23059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204236.23412: stderr chunk (state=3): >>>debug1: mux_client_request_session: master 
session id: 2 <<< 22736 1727204236.25863: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # <<< 22736 1727204236.25883: stdout chunk (state=3): >>>import 'marshal' # <<< 22736 1727204236.25951: stdout chunk (state=3): >>>import 'posix' # <<< 22736 1727204236.26010: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 22736 1727204236.26019: stdout chunk (state=3): >>> # installing zipimport hook<<< 22736 1727204236.26056: stdout chunk (state=3): >>> import 'time' # <<< 22736 1727204236.26170: stdout chunk (state=3): >>>import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 22736 1727204236.26292: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.26296: stdout chunk (state=3): >>>import '_codecs' # <<< 22736 1727204236.26298: stdout chunk (state=3): >>> <<< 22736 1727204236.26301: stdout chunk (state=3): >>>import 'codecs' # <<< 22736 1727204236.26406: stdout chunk (state=3): >>> <<< 22736 1727204236.26433: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788a00c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889fdbad0><<< 22736 1727204236.26460: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 22736 1727204236.26517: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788a00ea20> import '_signal' # <<< 22736 1727204236.26617: stdout chunk (state=3): >>> <<< 22736 1727204236.26620: stdout chunk (state=3): >>>import '_abc' # <<< 22736 1727204236.26623: stdout chunk (state=3): >>> <<< 22736 1727204236.26640: stdout chunk (state=3): >>>import 'abc' # import 'io' # <<< 22736 1727204236.26745: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 22736 1727204236.26855: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # <<< 22736 1727204236.26881: stdout chunk (state=3): >>>import 'posixpath' # <<< 22736 1727204236.26884: stdout chunk (state=3): >>> <<< 22736 1727204236.26982: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 22736 1727204236.26987: stdout chunk (state=3): >>>Processing user site-packages<<< 22736 1727204236.26993: stdout chunk (state=3): >>> Processing global site-packages<<< 22736 1727204236.27074: stdout chunk (state=3): >>> Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 22736 1727204236.27078: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22736 1727204236.27111: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from 
'/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 22736 1727204236.27149: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e210a0> <<< 22736 1727204236.27243: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 22736 1727204236.27294: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e21fd0> <<< 22736 1727204236.27400: stdout chunk (state=3): >>>import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux <<< 22736 1727204236.27403: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information. <<< 22736 1727204236.28043: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 22736 1727204236.28094: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc'<<< 22736 1727204236.28097: stdout chunk (state=3): >>> <<< 22736 1727204236.28145: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py<<< 22736 1727204236.28269: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 22736 1727204236.28273: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py<<< 22736 1727204236.28275: stdout chunk (state=3): >>> <<< 22736 1727204236.28358: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 22736 1727204236.28410: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5fe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 22736 1727204236.28478: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5fec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py<<< 22736 1727204236.28487: stdout chunk (state=3): >>> <<< 22736 1727204236.28575: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22736 1727204236.28642: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.28723: stdout chunk (state=3): >>>import 'itertools' # <<< 22736 1727204236.28727: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 22736 1727204236.28796: 
stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e97830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' <<< 22736 1727204236.28905: stdout chunk (state=3): >>>import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e97ec0> import '_collections' # <<< 22736 1727204236.28975: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e77ad0> import '_functools' # <<< 22736 1727204236.28979: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e751f0><<< 22736 1727204236.29137: stdout chunk (state=3): >>> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5cfb0> <<< 22736 1727204236.29228: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22736 1727204236.29261: stdout chunk (state=3): >>>import '_sre' # <<< 22736 1727204236.29264: stdout chunk (state=3): >>> <<< 22736 1727204236.29293: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 22736 1727204236.29337: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 22736 1727204236.29374: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py <<< 22736 1727204236.29409: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22736 1727204236.29521: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ebb770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eba390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py<<< 22736 1727204236.29579: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e77e30><<< 22736 1727204236.29582: stdout chunk (state=3): >>> <<< 22736 1727204236.29775: stdout chunk (state=3): >>>import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eb8c50> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 22736 1727204236.29778: stdout chunk (state=3): >>> <<< 22736 1727204236.29781: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eec710><<< 22736 1727204236.29784: stdout chunk (state=3): >>> <<< 22736 1727204236.29786: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5c230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 22736 1727204236.29915: stdout chunk (state=3): >>> # extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889eecbc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eeca70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889eece60><<< 22736 1727204236.29935: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5ad50> <<< 22736 1727204236.29986: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204236.30227: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eed520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eed220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 22736 1727204236.30322: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' <<< 22736 1727204236.30326: stdout chunk (state=3): >>>import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f08650> import 'errno' # <<< 22736 1727204236.30514: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f09d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0ac90><<< 22736 1727204236.30569: stdout chunk (state=3): >>> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.30711: stdout chunk (state=3): >>># extension 
module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f0b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0a1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22736 1727204236.30748: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.30770: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f0bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0b4a0><<< 22736 1727204236.30782: stdout chunk (state=3): >>> <<< 22736 1727204236.30866: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 22736 1727204236.30951: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 22736 1727204236.30954: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22736 1727204236.31050: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204236.31095: stdout chunk (state=3): >>> import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c5fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc'<<< 22736 1727204236.31136: stdout chunk (state=3): >>> # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.31285: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8c3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.31290: stdout chunk (state=3): >>>import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c5c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.31305: stdout chunk (state=3): >>># extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c830><<< 22736 1727204236.31339: stdout 
chunk (state=3): >>> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c5de20> <<< 22736 1727204236.31374: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 22736 1727204236.31478: stdout chunk (state=3): >>> <<< 22736 1727204236.31592: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 22736 1727204236.31615: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22736 1727204236.31741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8df40> <<< 22736 1727204236.31744: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8cbc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee600> <<< 22736 1727204236.31782: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22736 1727204236.31860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.31909: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22736 1727204236.31950: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 22736 1727204236.32069: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cba270> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22736 1727204236.32274: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd23c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22736 1727204236.32507: stdout chunk (state=3): >>>import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d0b170> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22736 1727204236.32535: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d31910> <<< 22736 1727204236.32627: stdout chunk (state=3): >>>import 
'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d0b290> <<< 22736 1727204236.32656: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd3050> <<< 22736 1727204236.32847: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d08980> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd1400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8ee10> <<< 22736 1727204236.32966: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7889b08590> <<< 22736 1727204236.33106: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_v6r4i079/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available <<< 22736 1727204236.33329: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22736 1727204236.33439: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 22736 1727204236.33508: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b6e030> import '_typing' # <<< 22736 1727204236.33840: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b44f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b0bfb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 22736 1727204236.35721: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.38151: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b47e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded 
from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba19a0> <<< 22736 1727204236.38510: stdout chunk (state=3): >>>import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1a60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0b230> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba26f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba2930> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # <<< 22736 1727204236.38629: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba2e70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a00b30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a02750> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a03110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a03f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22736 1727204236.38649: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7889a06de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a06f00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a050a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22736 1727204236.38655: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' <<< 22736 1727204236.38683: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' <<< 22736 1727204236.38719: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22736 1727204236.38746: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22736 1727204236.38771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 22736 1727204236.38805: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a0ad50> <<< 22736 1727204236.38810: stdout chunk (state=3): >>>import '_tokenize' # <<< 22736 1727204236.38891: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a09820> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a09580> <<< 22736 1727204236.38940: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 22736 1727204236.39010: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a0b980> <<< 22736 1727204236.39043: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a055b0> <<< 22736 1727204236.39069: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a4ef90> <<< 22736 1727204236.39122: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 22736 1727204236.39165: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a4f0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 
22736 1727204236.39168: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22736 1727204236.39236: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a54c80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a54a70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22736 1727204236.39351: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22736 1727204236.39408: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a571a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a552e0> <<< 22736 1727204236.39442: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22736 1727204236.39485: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.39514: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22736 1727204236.39541: stdout chunk (state=3): >>>import '_string' # <<< 22736 1727204236.39571: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a5e9c0> <<< 22736 1727204236.39851: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a57350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5fc50> <<< 22736 1727204236.39854: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5faa0> <<< 22736 1727204236.39919: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5fc80> <<< 22736 
1727204236.39970: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a4f3e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22736 1727204236.39973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22736 1727204236.40005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22736 1727204236.40052: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.40108: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a634a0> <<< 22736 1727204236.40419: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a646b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a61c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a62f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a617f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22736 1727204236.40549: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.40716: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204236.40719: stdout chunk (state=3): >>> <<< 22736 1727204236.40750: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204236.40774: stdout chunk (state=3): >>> import 'ansible.module_utils.common' # <<< 22736 1727204236.40801: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22736 1727204236.40824: stdout chunk (state=3): >>> <<< 22736 1727204236.40895: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 22736 1727204236.41135: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204236.41398: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22736 1727204236.41430: stdout chunk (state=3): >>> <<< 22736 1727204236.42648: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204236.42652: stdout chunk (state=3): >>> <<< 22736 1727204236.43566: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22736 1727204236.43584: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # 
/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.43665: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898e86b0> <<< 22736 1727204236.43908: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898e94c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba2840> <<< 22736 1727204236.43914: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 22736 1727204236.44005: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22736 1727204236.44384: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.44624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898e9490> # zipimport: zlib available <<< 22736 1727204236.45506: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.46544: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.46608: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.46762: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # <<< 22736 1727204236.46869: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.46872: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.46902: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 22736 1727204236.46922: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.47055: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.47406: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.47446: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 22736 1727204236.47454: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.47930: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.48433: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22736 1727204236.48545: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22736 1727204236.48563: stdout chunk (state=3): >>>import '_ast' # <<< 22736 1727204236.48716: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898ebef0> <<< 22736 
1727204236.48738: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.48855: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.49014: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22736 1727204236.49054: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # <<< 22736 1727204236.49058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 22736 1727204236.49085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22736 1727204236.49181: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.49623: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f1fa0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f2900> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898ead80> # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.49641: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available <<< 22736 1727204236.49824: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.49856: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.49979: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22736 1727204236.50053: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.50197: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f1640> <<< 22736 1727204236.50271: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898f29c0> <<< 22736 1727204236.50320: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # <<< 22736 1727204236.50404: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.50454: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.50581: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.50633: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.50669: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.50703: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 22736 1727204236.50738: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 22736 1727204236.50765: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22736 1727204236.50855: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22736 1727204236.50880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22736 1727204236.50920: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22736 1727204236.51024: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998ac00> <<< 22736 1727204236.51176: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898fc980> <<< 22736 1727204236.51241: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898faa20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898fa870> <<< 22736 1727204236.51262: stdout chunk (state=3): >>># destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 22736 1727204236.51301: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51348: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22736 1727204236.51471: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 22736 1727204236.51493: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # <<< 22736 1727204236.51680: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.51726: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51738: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51773: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51836: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51920: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.51963: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52043: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 22736 1727204236.52062: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52182: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52320: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52352: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52421: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 22736 1727204236.52424: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.52757: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 22736 1727204236.53085: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.53157: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.53241: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.53283: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 22736 1727204236.53319: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 22736 1727204236.53327: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 22736 1727204236.53487: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998d9d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22736 1727204236.53516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22736 1727204236.53553: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22736 1727204236.53590: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 22736 1727204236.53604: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb0320> <<< 22736 1727204236.53646: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.53663: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb0650> <<< 22736 1727204236.53738: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788996d3a0> <<< 22736 1727204236.53768: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788996c6b0> <<< 22736 1727204236.53847: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998c170> <<< 22736 1727204236.53850: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998f860> <<< 22736 1727204236.53867: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 22736 1727204236.53947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' 
<<< 22736 1727204236.53961: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 22736 1727204236.54188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb3680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb2f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb3110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb2390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 22736 1727204236.54303: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 22736 1727204236.54339: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb3770> <<< 22736 1727204236.54352: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22736 1727204236.54403: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22736 1727204236.54435: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f1e210> <<< 22736 1727204236.54495: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a605c0> <<< 22736 1727204236.54545: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998d160> import 'ansible.module_utils.facts.timeout' # <<< 22736 1727204236.54583: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 22736 1727204236.54620: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.54642: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available <<< 22736 1727204236.54745: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.54823: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 22736 1727204236.54862: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.54932: stdout chunk (state=3): >>># zipimport: zlib available 
<<< 22736 1727204236.55025: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 22736 1727204236.55052: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 22736 1727204236.55077: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55124: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55175: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 22736 1727204236.55189: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55273: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55350: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # <<< 22736 1727204236.55362: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55422: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55500: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 22736 1727204236.55503: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55591: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55694: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.55782: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.56088: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 22736 1727204236.56819: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.57681: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22736 1727204236.57685: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.57772: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.57866: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.57917: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.57978: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 22736 1727204236.57999: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58038: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 22736 1727204236.58116: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58195: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58284: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 22736 1727204236.58308: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58347: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58403: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22736 1727204236.58426: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58505: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 22736 1727204236.58523: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.58651: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 
1727204236.58806: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py <<< 22736 1727204236.58809: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22736 1727204236.58843: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f1fc80> <<< 22736 1727204236.58880: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22736 1727204236.59080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22736 1727204236.59135: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f1ef60> import 'ansible.module_utils.facts.system.local' # <<< 22736 1727204236.59162: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59262: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59370: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 22736 1727204236.59390: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59546: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59710: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22736 1727204236.59717: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59815: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.59941: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 22736 1727204236.59961: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.60014: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.60094: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22736 1727204236.60163: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22736 1727204236.60281: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.60385: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.60401: stdout chunk (state=3): >>>import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f4a510> <<< 22736 1727204236.60758: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f37350> import 'ansible.module_utils.facts.system.python' # <<< 22736 1727204236.60782: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.61095: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available <<< 22736 1727204236.61118: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.61247: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.61454: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.61722: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 22736 1727204236.61741: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 22736 1727204236.61802: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.61858: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 22736 1727204236.62005: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.62021: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22736 1727204236.62071: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.62130: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f65f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f678f0> <<< 22736 1727204236.62138: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available<<< 22736 1727204236.62184: stdout chunk (state=3): >>> # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # <<< 22736 1727204236.62187: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.62242: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.62322: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 22736 1727204236.62325: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.62605: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.62891: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22736 1727204236.62915: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.63072: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.63478: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.63695: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.63946: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 22736 1727204236.63978: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.64197: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.64423: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 22736 1727204236.64448: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.64477: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.64532: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.65593: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.66566: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 22736 1727204236.66584: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 22736 1727204236.66770: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22736 1727204236.66987: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # <<< 22736 1727204236.66992: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.67135: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.67336: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available <<< 22736 1727204236.67605: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.67888: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 22736 1727204236.67907: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.67918: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.67928: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 22736 1727204236.67951: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.68012: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.68076: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # <<< 22736 1727204236.68093: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.68264: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.68671: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.68829: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69226: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # <<< 22736 1727204236.69254: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available <<< 22736 1727204236.69301: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69376: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.darwin' # <<< 22736 1727204236.69380: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69411: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69455: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # <<< 22736 1727204236.69458: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69572: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69724: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 22736 1727204236.69728: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69780: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69784: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.freebsd' # <<< 22736 1727204236.69806: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69887: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.69984: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 22736 1727204236.70002: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.70088: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.70188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 22736 1727204236.70203: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.70700: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71202: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.network.linux' # <<< 22736 1727204236.71206: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71293: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71380: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 22736 1727204236.71406: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71508: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 22736 1727204236.71517: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71673: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.71711: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 22736 1727204236.71732: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71859: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.71993: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 22736 1727204236.72013: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72034: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72046: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual' # <<< 22736 1727204236.72062: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72126: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72196: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 22736 1727204236.72209: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72239: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72374: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204236.72381: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22736 1727204236.72424: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72676: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22736 1727204236.72683: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72758: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.72839: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22736 1727204236.72856: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.73247: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.73622: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 22736 1727204236.73625: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.73706: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.73818: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available <<< 22736 1727204236.73865: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.73935: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22736 1727204236.73958: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 22736 1727204236.74078: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.74220: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 22736 1727204236.74227: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 22736 1727204236.74471: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204236.74535: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # <<< 22736 1727204236.74555: stdout chunk (state=3): >>>import 'ansible.module_utils.facts' # <<< 22736 1727204236.74722: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204236.76367: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py <<< 22736 1727204236.76397: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 22736 1727204236.76441: stdout chunk (state=3): >>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204236.76460: stdout chunk (state=3): >>># extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888d93710> <<< 22736 1727204236.76503: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888d90620><<< 22736 1727204236.76509: stdout chunk (state=3): >>> <<< 22736 1727204236.76600: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888d8c110><<< 22736 1727204236.76609: stdout chunk (state=3): >>> <<< 22736 1727204236.89922: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 22736 1727204236.89962: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888ddb2c0> <<< 22736 1727204236.89965: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py <<< 22736 1727204236.89992: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' <<< 22736 1727204236.90020: stdout chunk (state=3): >>>import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888dd97f0> <<< 22736 1727204236.90062: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py <<< 22736 1727204236.90075: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204236.90128: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' <<< 22736 1727204236.90151: stdout chunk (state=3): >>>import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f526f0> <<< 22736 1727204236.90154: stdout chunk (state=3): >>>import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888ddb6e0> <<< 22736 1727204236.90421: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 22736 1727204237.15078: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, 
"ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "16", "epoch": "1727204236", "epoch_int": "1727204236", "date": "2024-09-24", "time": "14:57:16", "iso8601_micro": "2024-09-24T18:57:16.770175Z", "iso8601": "2024-09-24T18:57:16Z", "iso8601_basic": "20240924T145716770175", "iso8601_basic_short": "20240924T145716", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bi<<< 22736 1727204237.15085: stdout chunk (state=3): >>>os_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, 
"start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 740, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146997760, "block_size": 4096, "block_total": 64479564, "block_available": 61315185, "block_used": 3164379, "inode_total": 16384000, "inode_available": 16302246, "inode_used": 81754, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 1.02001953125, "5m": 0.67578125, "15m": 0.40625}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_in<<< 22736 1727204237.15120: stdout chunk (state=3): >>>fo": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": 
"10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204237.16145: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # 
cleanup[2] removing lzma # cleanup[2] removing shutil<<< 22736 1727204237.16336: stdout chunk (state=3): >>> # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # 
cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.tex<<< 22736 1727204237.16375: stdout chunk (state=3): >>>t.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] 
removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # 
cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy 
ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 22736 1727204237.16801: stdout chunk (state=3): >>># destroy _sitebuiltins<<< 22736 1727204237.16815: stdout chunk (state=3): >>> # destroy importlib.machinery <<< 22736 1727204237.16870: stdout chunk (state=3): >>># destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 22736 1727204237.16926: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy binascii # destroy zlib<<< 22736 1727204237.16949: stdout chunk (state=3): >>> # destroy bz2 # destroy lzma # destroy zipfile._path<<< 22736 1727204237.16986: stdout chunk (state=3): >>> # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress<<< 22736 1727204237.17032: stdout chunk (state=3): >>> # destroy ntpath<<< 22736 1727204237.17101: stdout chunk (state=3): >>> # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings <<< 22736 1727204237.17141: stdout chunk (state=3): >>># destroy _locale # destroy locale <<< 22736 1727204237.17170: stdout chunk (state=3): >>># destroy select # destroy _signal # destroy _posixsubprocess<<< 22736 1727204237.17181: stdout chunk (state=3): >>> # destroy syslog # destroy uuid<<< 22736 1727204237.17263: stdout chunk (state=3): >>> # destroy _hashlib # destroy _blake2<<< 22736 1727204237.17274: stdout chunk (state=3): >>> # destroy selinux # destroy shutil <<< 22736 1727204237.17326: stdout chunk (state=3): >>># destroy distro <<< 22736 1727204237.17337: stdout chunk (state=3): >>># destroy distro.distro # destroy argparse<<< 22736 1727204237.17404: stdout chunk (state=3): >>> # destroy logging # destroy ansible.module_utils.facts.default_collectors <<< 22736 1727204237.17451: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize<<< 22736 1727204237.17515: stdout chunk (state=3): >>> # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle<<< 22736 1727204237.17520: stdout chunk (state=3): >>> # destroy _compat_pickle # destroy _pickle <<< 22736 1727204237.17543: stdout chunk (state=3): >>># destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction<<< 22736 1727204237.17602: stdout chunk (state=3): >>> # destroy selectors <<< 22736 1727204237.17613: stdout chunk (state=3): >>># destroy shlex<<< 22736 1727204237.17638: stdout chunk (state=3): >>> # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 <<< 22736 1727204237.17728: stdout chunk (state=3): >>># destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 22736 1727204237.17748: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy json <<< 22736 1727204237.17832: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy 
glob<<< 22736 1727204237.17855: stdout chunk (state=3): >>> # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile<<< 22736 1727204237.17882: stdout chunk (state=3): >>> # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection<<< 22736 1727204237.17967: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.idna <<< 22736 1727204237.17997: stdout chunk (state=3): >>># destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian<<< 22736 1727204237.18037: stdout chunk (state=3): >>> # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128<<< 22736 1727204237.18070: stdout chunk (state=3): >>> # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache<<< 22736 1727204237.18099: stdout chunk (state=3): >>> # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize<<< 22736 1727204237.18126: stdout chunk (state=3): >>> # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref<<< 22736 1727204237.18214: stdout chunk (state=3): >>> # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap<<< 22736 1727204237.18223: stdout chunk (state=3): >>> # cleanup[3] wiping _struct # cleanup[3] wiping re<<< 22736 1727204237.18226: stdout chunk (state=3): >>> # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser<<< 22736 1727204237.18288: stdout chunk (state=3): >>> # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types<<< 22736 1727204237.18295: stdout chunk (state=3): >>> # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath <<< 22736 1727204237.18381: stdout chunk (state=3): >>># cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] 
wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22736 1727204237.18561: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22736 1727204237.18651: stdout chunk (state=3): >>># destroy _socket <<< 22736 1727204237.18695: stdout chunk (state=3): >>># destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser<<< 22736 1727204237.18716: stdout chunk (state=3): >>> <<< 22736 1727204237.18811: stdout chunk (state=3): >>># destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg <<< 22736 1727204237.18906: stdout chunk (state=3): >>># destroy contextlib <<< 22736 1727204237.18916: stdout chunk (state=3): >>># destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse<<< 22736 1727204237.18949: stdout chunk (state=3): >>> # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22736 1727204237.19107: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8<<< 22736 1727204237.19150: stdout chunk (state=3): >>> # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref<<< 22736 1727204237.19319: stdout chunk (state=3): >>> # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 22736 1727204237.20006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204237.20010: stdout chunk (state=3): >>><<< 22736 1727204237.20015: stderr chunk (state=3): >>><<< 22736 1727204237.20183: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788a00c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889fdbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788a00ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e210a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e21fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5fe00> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5fec0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e97830> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e97ec0> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e77ad0> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e751f0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5cfb0> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ebb770> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eba390> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e77e30> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eb8c50> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eec710> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5c230> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889eecbc0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eeca70> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889eece60> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889e5ad50> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eed520> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eed220> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee420> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f08650> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f09d90> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f7889f0ac90> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f0b2f0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0a1e0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889f0bd70> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0b4a0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee480> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c5fc50> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c650> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8c3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c5c0> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889c8c830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c5de20> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8df40> 
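Two loader types are visible side by side in the trace so far: pure-Python stdlib modules are loaded from cached bytecode ("# code object from ...__pycache__/....pyc", a SourceFileLoader), while C extension modules such as _bz2, _lzma and math come from .so files via an ExtensionFileLoader. A small illustrative check (nothing Ansible-specific, just the stdlib import metadata that produces these lines):

import importlib

for name in ("bz2", "_bz2", "math"):   # names taken from the trace above
    mod = importlib.import_module(name)
    spec = mod.__spec__
    # spec.origin is the .py or .so path; the loader class matches the
    # SourceFileLoader / ExtensionFileLoader objects printed in the log.
    print(f"{name:8s} {type(spec.loader).__name__:22s} {spec.origin}")
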
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8cbc0> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889eee600> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cba270> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd23c0> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d0b170> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d31910> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d0b290> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd3050> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889d08980> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889cd1400> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889c8ee10> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f7889b08590> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_v6r4i079/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b6e030> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b44f20> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b0bfb0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889b47e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba19a0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1730> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1040> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba1a60> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889f0b230> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba26f0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889ba2930> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba2e70> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a00b30> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a02750> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a03110> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a03f50> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a06de0> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a06f00> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a050a0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a0ad50> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a09820> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a09580> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a0b980> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a055b0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a4ef90> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a4f0e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a54c80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a54a70> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a571a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a552e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a5e9c0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a57350> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5fc50> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5faa0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a5fc80> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a4f3e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a634a0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a646b0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a61c10> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7889a62f90> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a617f0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898e86b0> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898e94c0> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889ba2840> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
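The repeated "# zipimport: zlib available" markers, together with the earlier "# zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_.../...zip'" line, show that the ansible.module_utils.* imports here are being satisfied straight out of the zipped module payload on sys.path, which is Python's standard zipimport mechanism. A minimal sketch of that mechanism, assuming made-up names (payload.zip, hello.py) rather than Ansible's real payload layout:

import importlib, os, sys, tempfile, zipfile

tmpdir = tempfile.mkdtemp()
archive = os.path.join(tmpdir, "payload.zip")
with zipfile.ZipFile(archive, "w") as zf:
    zf.writestr("hello.py", "GREETING = 'imported from a zip'\n")

sys.path.insert(0, archive)              # zip archives on sys.path are handled by zipimport
hello = importlib.import_module("hello")
print(hello.GREETING)                    # -> imported from a zip
print(type(hello.__loader__).__name__)   # zipimporter, like the payload import above
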
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898e9490> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898ebef0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f1fa0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f2900> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898ead80> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f78898f1640> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898f29c0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998ac00> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898fc980> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898faa20> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f78898fa870> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998d9d0> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb0320> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb0650> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788996d3a0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788996c6b0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998c170> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998f860> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb3680> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb2f30> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888eb3110> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb2390> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888eb3770> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f1e210> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7889a605c0> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7f788998d160> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f1fc80> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f1ef60> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f4a510> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f37350> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888f65f40> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f678f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f7888d93710> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888d90620> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888d8c110> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888ddb2c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888dd97f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888f526f0> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f7888ddb6e0> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_is_chroot": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": 
[], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "16", "epoch": "1727204236", "epoch_int": "1727204236", "date": "2024-09-24", "time": "14:57:16", "iso8601_micro": "2024-09-24T18:57:16.770175Z", "iso8601": "2024-09-24T18:57:16Z", "iso8601_basic": "20240924T145716770175", "iso8601_basic_short": "20240924T145716", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_hostnqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2841, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 876, "free": 2841}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": 
[]}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 740, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146997760, "block_size": 4096, "block_total": 64479564, "block_available": 61315185, "block_used": 3164379, "inode_total": 16384000, "inode_available": 16302246, "inode_used": 81754, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_lsb": {}, "ansible_loadavg": {"1m": 1.02001953125, "5m": 0.67578125, "15m": 0.40625}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", 
"XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # 
destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing 
ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] 
removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing 
ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing 
multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # 
cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: [same verbose interpreter cleanup/destroy trace as already captured in the module stdout above] [WARNING]: Platform linux on host managed-node2 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible-core/2.17/reference_appendices/interpreter_discovery.html for more information.
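Both warnings above tie back to facts gathered earlier in this run: the "junk after the JSON data" is the Python interpreter's verbose import/cleanup trace (ansible_env above shows PYTHONVERBOSE=1 in root's environment on the managed node, which is the likely cause), and the second warning only notes that /usr/bin/python3.12 was auto-discovered rather than pinned. As a minimal, illustrative sketch of how such warnings are usually silenced -- not taken from the actual inventory-Sfc.yml, and using only the host name and interpreter path seen in this log -- a YAML inventory can pin the interpreter explicitly:

all:
  hosts:
    managed-node2:
      # ansible_python_interpreter is the standard Ansible variable that
      # bypasses interpreter discovery (and its warning) for this host
      ansible_python_interpreter: /usr/bin/python3.12

The verbose-interpreter junk itself goes away once PYTHONVERBOSE is unset (or set to an empty string) in the remote user's login environment, since that variable comes from the managed node's shell, not from Ansible; the run here still succeeds because Ansible strips the extra output and parses the JSON facts anyway.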
22736 1727204237.24765: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204237.24769: _low_level_execute_command(): starting 22736 1727204237.24783: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204235.2810752-22806-7188534212795/ > /dev/null 2>&1 && sleep 0' 22736 1727204237.25751: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204237.25769: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204237.25794: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204237.25868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204237.25936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204237.25971: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204237.25996: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.26180: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204237.29108: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204237.29148: stdout chunk (state=3): >>><<< 22736 1727204237.29152: stderr chunk (state=3): >>><<< 22736 1727204237.29398: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204237.29402: handler run complete 22736 1727204237.29684: variable 'ansible_facts' from source: unknown 22736 1727204237.29971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.31135: variable 'ansible_facts' from source: unknown 22736 1727204237.31673: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.31897: attempt loop complete, returning result 22736 1727204237.32029: _execute() done 22736 1727204237.32037: dumping result to json 22736 1727204237.32072: done dumping result, returning 22736 1727204237.32121: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-00000000007c] 22736 1727204237.32154: sending task result for task 12b410aa-8751-4f4a-548a-00000000007c 22736 1727204237.33370: done sending task result for task 12b410aa-8751-4f4a-548a-00000000007c 22736 1727204237.33374: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204237.33881: no more pending results, returning what we have 22736 1727204237.33996: results queue empty 22736 1727204237.33998: checking for any_errors_fatal 22736 1727204237.34004: done checking for any_errors_fatal 22736 1727204237.34005: checking for max_fail_percentage 22736 1727204237.34007: done checking for max_fail_percentage 22736 1727204237.34008: checking to see if all hosts have failed and the running result is not ok 22736 1727204237.34009: done checking to see if all hosts have failed 22736 1727204237.34010: getting the remaining hosts for this loop 22736 1727204237.34014: done getting the remaining hosts for this loop 22736 1727204237.34019: getting the next task for host managed-node2 22736 1727204237.34027: done getting next task for host managed-node2 22736 1727204237.34029: ^ task is: TASK: meta (flush_handlers) 22736 1727204237.34031: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204237.34036: getting variables 22736 1727204237.34037: in VariableManager get_vars() 22736 1727204237.34063: Calling all_inventory to load vars for managed-node2 22736 1727204237.34066: Calling groups_inventory to load vars for managed-node2 22736 1727204237.34070: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204237.34082: Calling all_plugins_play to load vars for managed-node2 22736 1727204237.34086: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204237.34198: Calling groups_plugins_play to load vars for managed-node2 22736 1727204237.35045: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.36020: done with get_vars() 22736 1727204237.36035: done getting variables 22736 1727204237.36531: in VariableManager get_vars() 22736 1727204237.36545: Calling all_inventory to load vars for managed-node2 22736 1727204237.36548: Calling groups_inventory to load vars for managed-node2 22736 1727204237.36551: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204237.36557: Calling all_plugins_play to load vars for managed-node2 22736 1727204237.36560: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204237.36564: Calling groups_plugins_play to load vars for managed-node2 22736 1727204237.37627: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.38592: done with get_vars() 22736 1727204237.38615: done queuing things up, now waiting for results queue to drain 22736 1727204237.38618: results queue empty 22736 1727204237.38619: checking for any_errors_fatal 22736 1727204237.38622: done checking for any_errors_fatal 22736 1727204237.38623: checking for max_fail_percentage 22736 1727204237.38625: done checking for max_fail_percentage 22736 1727204237.38625: checking to see if all hosts have failed and the running result is not ok 22736 1727204237.38631: done checking to see if all hosts have failed 22736 1727204237.38632: getting the remaining hosts for this loop 22736 1727204237.38748: done getting the remaining hosts for this loop 22736 1727204237.38753: getting the next task for host managed-node2 22736 1727204237.38759: done getting next task for host managed-node2 22736 1727204237.38762: ^ task is: TASK: Include the task 'el_repo_setup.yml' 22736 1727204237.38763: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204237.38766: getting variables 22736 1727204237.38767: in VariableManager get_vars() 22736 1727204237.38778: Calling all_inventory to load vars for managed-node2 22736 1727204237.38781: Calling groups_inventory to load vars for managed-node2 22736 1727204237.38784: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204237.38791: Calling all_plugins_play to load vars for managed-node2 22736 1727204237.38794: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204237.38798: Calling groups_plugins_play to load vars for managed-node2 22736 1727204237.39462: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.40382: done with get_vars() 22736 1727204237.40396: done getting variables TASK [Include the task 'el_repo_setup.yml'] ************************************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:11 Tuesday 24 September 2024 14:57:17 -0400 (0:00:02.177) 0:00:02.192 ***** 22736 1727204237.40757: entering _queue_task() for managed-node2/include_tasks 22736 1727204237.40982: Creating lock for include_tasks 22736 1727204237.41845: worker is 1 (out of 1 available) 22736 1727204237.42082: exiting _queue_task() for managed-node2/include_tasks 22736 1727204237.42095: done queuing things up, now waiting for results queue to drain 22736 1727204237.42097: waiting for pending results... 22736 1727204237.42620: running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' 22736 1727204237.42627: in run() - task 12b410aa-8751-4f4a-548a-000000000006 22736 1727204237.42630: variable 'ansible_search_path' from source: unknown 22736 1727204237.42633: calling self._execute() 22736 1727204237.42773: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204237.42996: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204237.43000: variable 'omit' from source: magic vars 22736 1727204237.43396: _execute() done 22736 1727204237.43400: dumping result to json 22736 1727204237.43403: done dumping result, returning 22736 1727204237.43406: done running TaskExecutor() for managed-node2/TASK: Include the task 'el_repo_setup.yml' [12b410aa-8751-4f4a-548a-000000000006] 22736 1727204237.43408: sending task result for task 12b410aa-8751-4f4a-548a-000000000006 22736 1727204237.43486: done sending task result for task 12b410aa-8751-4f4a-548a-000000000006 22736 1727204237.43493: WORKER PROCESS EXITING 22736 1727204237.43546: no more pending results, returning what we have 22736 1727204237.43552: in VariableManager get_vars() 22736 1727204237.43587: Calling all_inventory to load vars for managed-node2 22736 1727204237.43592: Calling groups_inventory to load vars for managed-node2 22736 1727204237.43597: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204237.43615: Calling all_plugins_play to load vars for managed-node2 22736 1727204237.43620: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204237.43625: Calling groups_plugins_play to load vars for managed-node2 22736 1727204237.44260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.44932: done with get_vars() 22736 1727204237.44942: variable 'ansible_search_path' from source: unknown 22736 1727204237.44958: we have included files to process 22736 1727204237.44959: 
generating all_blocks data 22736 1727204237.44961: done generating all_blocks data 22736 1727204237.44962: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22736 1727204237.44964: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22736 1727204237.44967: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 22736 1727204237.46842: in VariableManager get_vars() 22736 1727204237.46865: done with get_vars() 22736 1727204237.46881: done processing included file 22736 1727204237.46883: iterating over new_blocks loaded from include file 22736 1727204237.46885: in VariableManager get_vars() 22736 1727204237.46900: done with get_vars() 22736 1727204237.46902: filtering new block on tags 22736 1727204237.46920: done filtering new block on tags 22736 1727204237.46924: in VariableManager get_vars() 22736 1727204237.46961: done with get_vars() 22736 1727204237.46963: filtering new block on tags 22736 1727204237.46986: done filtering new block on tags 22736 1727204237.47193: in VariableManager get_vars() 22736 1727204237.47208: done with get_vars() 22736 1727204237.47210: filtering new block on tags 22736 1727204237.47227: done filtering new block on tags 22736 1727204237.47230: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed-node2 22736 1727204237.47238: extending task lists for all hosts with included blocks 22736 1727204237.47508: done extending task lists 22736 1727204237.47509: done processing included files 22736 1727204237.47511: results queue empty 22736 1727204237.47511: checking for any_errors_fatal 22736 1727204237.47513: done checking for any_errors_fatal 22736 1727204237.47514: checking for max_fail_percentage 22736 1727204237.47516: done checking for max_fail_percentage 22736 1727204237.47517: checking to see if all hosts have failed and the running result is not ok 22736 1727204237.47518: done checking to see if all hosts have failed 22736 1727204237.47519: getting the remaining hosts for this loop 22736 1727204237.47520: done getting the remaining hosts for this loop 22736 1727204237.47524: getting the next task for host managed-node2 22736 1727204237.47529: done getting next task for host managed-node2 22736 1727204237.47531: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 22736 1727204237.47534: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204237.47537: getting variables 22736 1727204237.47538: in VariableManager get_vars() 22736 1727204237.47550: Calling all_inventory to load vars for managed-node2 22736 1727204237.47554: Calling groups_inventory to load vars for managed-node2 22736 1727204237.47557: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204237.47565: Calling all_plugins_play to load vars for managed-node2 22736 1727204237.47568: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204237.47573: Calling groups_plugins_play to load vars for managed-node2 22736 1727204237.47988: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204237.48568: done with get_vars() 22736 1727204237.48580: done getting variables TASK [Gather the minimum subset of ansible_facts required by the network role test] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Tuesday 24 September 2024 14:57:17 -0400 (0:00:00.080) 0:00:02.273 ***** 22736 1727204237.48817: entering _queue_task() for managed-node2/setup 22736 1727204237.49512: worker is 1 (out of 1 available) 22736 1727204237.49525: exiting _queue_task() for managed-node2/setup 22736 1727204237.49537: done queuing things up, now waiting for results queue to drain 22736 1727204237.49539: waiting for pending results... 22736 1727204237.49928: running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test 22736 1727204237.50685: in run() - task 12b410aa-8751-4f4a-548a-00000000008d 22736 1727204237.50692: variable 'ansible_search_path' from source: unknown 22736 1727204237.50695: variable 'ansible_search_path' from source: unknown 22736 1727204237.50900: calling self._execute() 22736 1727204237.51225: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204237.51230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204237.51234: variable 'omit' from source: magic vars 22736 1727204237.53574: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204237.58387: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204237.58694: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204237.58747: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204237.58799: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204237.58932: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204237.59150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204237.59507: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204237.59511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22736 1727204237.59513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204237.59516: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204237.59935: variable 'ansible_facts' from source: unknown 22736 1727204237.60029: variable 'network_test_required_facts' from source: task vars 22736 1727204237.60240: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): True 22736 1727204237.60253: variable 'omit' from source: magic vars 22736 1727204237.60313: variable 'omit' from source: magic vars 22736 1727204237.60362: variable 'omit' from source: magic vars 22736 1727204237.60694: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204237.60697: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204237.60699: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204237.60707: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204237.60723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204237.60762: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204237.60771: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204237.60780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204237.61295: Set connection var ansible_timeout to 10 22736 1727204237.61299: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204237.61301: Set connection var ansible_shell_executable to /bin/sh 22736 1727204237.61305: Set connection var ansible_shell_type to sh 22736 1727204237.61308: Set connection var ansible_pipelining to False 22736 1727204237.61311: Set connection var ansible_connection to ssh 22736 1727204237.61314: variable 'ansible_shell_executable' from source: unknown 22736 1727204237.61319: variable 'ansible_connection' from source: unknown 22736 1727204237.61322: variable 'ansible_module_compression' from source: unknown 22736 1727204237.61324: variable 'ansible_shell_type' from source: unknown 22736 1727204237.61326: variable 'ansible_shell_executable' from source: unknown 22736 1727204237.61329: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204237.61331: variable 'ansible_pipelining' from source: unknown 22736 1727204237.61334: variable 'ansible_timeout' from source: unknown 22736 1727204237.61336: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204237.61581: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204237.61594: variable 'omit' from source: magic vars 22736 1727204237.61601: starting attempt loop 22736 
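
el_repo_setup.yml is likewise not quoted in the log, but the task name, the setup action being queued, and the conditional evaluated above constrain its shape. A hedged sketch of the fact-gathering task (the gather_subset value and the network_test_required_facts entries are illustrative placeholders, not taken from the file):

- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min
  when: >-
    not ansible_facts.keys() | list |
    intersect(network_test_required_facts) ==
    network_test_required_facts
  vars:
    # placeholder values; the log only shows that this variable comes from task vars
    network_test_required_facts:
      - distribution
      - distribution_major_version

If every required fact were already present in ansible_facts the condition would be false and the task would be skipped; here it evaluated True, so the setup module is dispatched to managed-node2 over the connection set up in the following lines.
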
1727204237.61603: running the handler 22736 1727204237.61624: _low_level_execute_command(): starting 22736 1727204237.61630: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204237.63140: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204237.63165: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204237.63180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204237.63270: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204237.63317: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204237.63344: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204237.63362: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.63708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204237.65249: stdout chunk (state=3): >>>/root <<< 22736 1727204237.65372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204237.65456: stderr chunk (state=3): >>><<< 22736 1727204237.65809: stdout chunk (state=3): >>><<< 22736 1727204237.65840: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204237.65853: _low_level_execute_command(): starting 22736 1727204237.65861: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075 `" && echo 
ansible-tmp-1727204237.6583977-22878-221871853001075="` echo /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075 `" ) && sleep 0' 22736 1727204237.67422: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204237.67529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204237.67649: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204237.67663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.67724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204237.70123: stdout chunk (state=3): >>>ansible-tmp-1727204237.6583977-22878-221871853001075=/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075 <<< 22736 1727204237.70127: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204237.70235: stderr chunk (state=3): >>><<< 22736 1727204237.70239: stdout chunk (state=3): >>><<< 22736 1727204237.70259: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204237.6583977-22878-221871853001075=/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204237.70495: variable 'ansible_module_compression' from source: unknown 22736 1727204237.70500: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204237.71098: variable 'ansible_facts' from source: unknown 22736 
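
The repeated "auto-mux: Trying existing master" and "mux_client_request_session" lines show OpenSSH connection multiplexing being reused for each of these short commands; ansible's ssh connection plugin requests ControlMaster/ControlPersist by default. Pinning the equivalent behaviour explicitly in inventory would look roughly like this (illustrative only; the ControlPath value in particular is a placeholder, not what this run used):

# host_vars/managed-node2.yml
ansible_connection: ssh
ansible_ssh_common_args: >-
  -o ControlMaster=auto
  -o ControlPersist=60s
  -o ControlPath=/root/.ansible/cp/%h-%p-%r

Reusing the master is why each _low_level_execute_command() here only opens a new mux session (master session id 2, then 4 below) instead of renegotiating a full SSH connection.
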
1727204237.71555: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py 22736 1727204237.71882: Sending initial data 22736 1727204237.71886: Sent initial data (154 bytes) 22736 1727204237.73396: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204237.73470: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204237.73484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.73591: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204237.75293: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22736 1727204237.75456: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 22736 1727204237.75461: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204237.75528: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204237.75718: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp411o8my5 /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py <<< 22736 1727204237.75732: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py" <<< 22736 1727204237.75793: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp411o8my5" to remote "/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py" <<< 22736 1727204237.81892: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204237.82008: stderr chunk (state=3): >>><<< 22736 1727204237.82019: stdout chunk (state=3): >>><<< 22736 1727204237.82176: done transferring module to remote 22736 1727204237.82286: _low_level_execute_command(): starting 22736 1727204237.82291: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/ /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py && sleep 0' 22736 1727204237.83769: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204237.83898: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204237.84019: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.84087: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204237.86231: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204237.86465: stderr chunk (state=3): >>><<< 22736 1727204237.86469: stdout chunk (state=3): >>><<< 22736 1727204237.86472: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204237.86475: _low_level_execute_command(): starting 22736 1727204237.86477: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/AnsiballZ_setup.py && sleep 0' 22736 1727204237.87203: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204237.87250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204237.91220: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad821b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82183ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad821b6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # 
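
The sftp put of AnsiballZ_setup.py, the chmod u+x, and the explicit /usr/bin/python3.12 invocation above are the non-pipelined execution path, consistent with "Set connection var ansible_pipelining to False" logged earlier. For comparison only (not what this run used), enabling pipelining in inventory lets most modules be streamed to the remote interpreter over the existing SSH channel instead of being written to a temporary file first:

# host_vars/managed-node2.yml -- illustrative
ansible_pipelining: true

The verbose import trace that follows comes from the module being started with PYTHONVERBOSE=1, as shown in the command line above.
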
import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fa90a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fa9fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. <<< 22736 1727204237.91581: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22736 1727204237.91604: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204237.91624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 22736 1727204237.91660: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 22736 1727204237.91680: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22736 1727204237.91743: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe7ec0> <<< 22736 1727204237.91922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe7f80> <<< 22736 1727204237.91998: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8201f8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # 
code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8201ff50> <<< 22736 1727204237.92002: stdout chunk (state=3): >>>import '_collections' # <<< 22736 1727204237.92061: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fffb60> <<< 22736 1727204237.92099: stdout chunk (state=3): >>>import '_functools' # <<< 22736 1727204237.92102: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81ffd2b0> <<< 22736 1727204237.92185: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe5070> <<< 22736 1727204237.92308: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22736 1727204237.92311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 22736 1727204237.92602: stdout chunk (state=3): >>>import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 22736 1727204237.92607: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82043890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820424b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81ffe2a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82040bc0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82074800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe42f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82074cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82074b60> <<< 22736 1727204237.92671: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82074f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7fad81fe2e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204237.92710: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82075610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820752e0> import 'importlib.machinery' # <<< 22736 1727204237.92946: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 22736 1727204237.92950: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82090740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82091e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 22736 1727204237.92973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' <<< 22736 1727204237.93027: stdout chunk (state=3): >>>import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82092d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad820933e0> <<< 22736 1727204237.93056: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820922d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 22736 1727204237.93117: stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82093e30> <<< 22736 1727204237.93120: stdout chunk (state=3): >>>import 
'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82093560> <<< 22736 1727204237.93177: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076570> <<< 22736 1727204237.93181: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 22736 1727204237.93257: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 22736 1727204237.93268: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 22736 1727204237.93335: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81dc7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 22736 1727204237.93353: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df0530> <<< 22736 1727204237.93621: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df0800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df09e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81dc5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22736 1727204237.93666: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df2000> <<< 22736 1727204237.93784: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df0c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076c60> <<< 22736 1727204237.93788: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 22736 1727204237.93929: stdout chunk (state=3): >>># code 
object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204237.93932: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 22736 1727204237.93946: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e1e390> <<< 22736 1727204237.94022: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 22736 1727204237.94125: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e36540> <<< 22736 1727204237.94146: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 22736 1727204237.94198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 22736 1727204237.94287: stdout chunk (state=3): >>>import 'ntpath' # <<< 22736 1727204237.94321: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e6f2f0> <<< 22736 1727204237.94392: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22736 1727204237.94416: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 22736 1727204237.94576: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 22736 1727204237.94621: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e95a90> <<< 22736 1727204237.94734: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e6f410> <<< 22736 1727204237.94811: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e371d0> <<< 22736 1727204237.94947: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81c6c440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e35580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df2f30> <<< 22736 1727204237.95103: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 22736 1727204237.95133: stdout chunk (state=3): >>> <<< 22736 1727204237.95160: stdout chunk 
(state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fad81c6c6e0> <<< 22736 1727204237.95476: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_setup_payload_myngajcl/ansible_setup_payload.zip'<<< 22736 1727204237.95510: stdout chunk (state=3): >>> <<< 22736 1727204237.95638: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204237.95797: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204237.95821: stdout chunk (state=3): >>> <<< 22736 1727204237.95858: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 22736 1727204237.95898: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22736 1727204237.96074: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22736 1727204237.96105: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22736 1727204237.96168: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 22736 1727204237.96190: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' <<< 22736 1727204237.96226: stdout chunk (state=3): >>>import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cda120> import '_typing' # <<< 22736 1727204237.96238: stdout chunk (state=3): >>> <<< 22736 1727204237.96833: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb10a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb0200> <<< 22736 1727204237.96934: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 22736 1727204237.98296: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.00442: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb35c0> <<< 22736 1727204238.00529: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204238.00543: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 22736 1727204238.00594: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 22736 1727204238.00750: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from 
'/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d09c10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d099a0> <<< 22736 1727204238.00788: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d092b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d09d00> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cdae40> <<< 22736 1727204238.00810: stdout chunk (state=3): >>>import 'atexit' # <<< 22736 1727204238.00844: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d0a9c0> <<< 22736 1727204238.00879: stdout chunk (state=3): >>># extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.00912: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d0ac00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22736 1727204238.00999: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' <<< 22736 1727204238.01240: stdout chunk (state=3): >>>import '_locale' # <<< 22736 1727204238.01243: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d0b0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b70ec0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81b72ae0> <<< 22736 1727204238.01305: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 22736 1727204238.01355: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b73440> <<< 22736 1727204238.01414: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 22736 1727204238.01457: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b74620> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches 
/usr/lib64/python3.12/subprocess.py <<< 22736 1727204238.01522: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 22736 1727204238.01547: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22736 1727204238.01697: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b77110> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81b77470> <<< 22736 1727204238.01727: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b753d0> <<< 22736 1727204238.01767: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22736 1727204238.01894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 22736 1727204238.02074: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 22736 1727204238.02077: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 22736 1727204238.02080: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b7b0e0> <<< 22736 1727204238.02083: stdout chunk (state=3): >>>import '_tokenize' # <<< 22736 1727204238.02105: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79bb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79940> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 22736 1727204238.02280: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79e80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b758e0> <<< 22736 1727204238.02315: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bbf260> <<< 22736 1727204238.02351: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bbf3b0> <<< 22736 1727204238.02384: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py <<< 22736 1727204238.02442: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22736 1727204238.02446: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 22736 1727204238.02509: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bc4f80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc4d40> <<< 22736 1727204238.02533: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 22736 1727204238.02725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22736 1727204238.02880: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bc74a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc55e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 22736 1727204238.02919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204238.02984: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py <<< 22736 1727204238.02987: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 22736 1727204238.02991: stdout chunk (state=3): >>>import '_string' # <<< 22736 1727204238.03056: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bcebd0> <<< 22736 1727204238.03391: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc7590> <<< 22736 1727204238.03436: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcf9b0> <<< 22736 1727204238.03495: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 
'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcfbc0> <<< 22736 1727204238.03563: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcfce0> <<< 22736 1727204238.03619: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bbf6b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22736 1727204238.03702: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22736 1727204238.03728: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.03764: stdout chunk (state=3): >>># extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd3620> <<< 22736 1727204238.04108: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.04111: stdout chunk (state=3): >>># extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd4b30> <<< 22736 1727204238.04166: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd1dc0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd3140> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd19a0> <<< 22736 1727204238.04235: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.04238: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.04395: stdout chunk (state=3): >>>import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22736 1727204238.04421: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.04584: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.04610: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.04644: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22736 1727204238.04778: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22736 1727204238.04908: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.05146: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.06352: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.07631: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 22736 1727204238.07635: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 22736 1727204238.07651: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204238.07799: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a5cb60> <<< 22736 1727204238.07907: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22736 1727204238.07933: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5d940> <<< 22736 1727204238.07976: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd4da0> <<< 22736 1727204238.08028: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # <<< 22736 1727204238.08049: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.08074: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.08115: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22736 1727204238.08405: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.08714: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 22736 1727204238.08743: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5d970> # zipimport: zlib available <<< 22736 1727204238.09908: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.10863: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.11129: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22736 1727204238.11249: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.11439: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22736 1727204238.11476: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 22736 1727204238.11502: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.11568: stdout chunk (state=3): >>># zipimport: zlib available <<< 
22736 1727204238.11685: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 22736 1727204238.12132: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.12610: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 22736 1727204238.12714: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22736 1727204238.12739: stdout chunk (state=3): >>>import '_ast' # <<< 22736 1727204238.12896: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5e8d0> # zipimport: zlib available <<< 22736 1727204238.13026: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.13167: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22736 1727204238.13225: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 22736 1727204238.13337: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.13618: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a664e0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a66e40> <<< 22736 1727204238.13628: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5ff20> <<< 22736 1727204238.13679: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.13699: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.13784: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22736 1727204238.13788: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.13834: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.13917: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.14422: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204238.14456: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a65b80> import 'selinux' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fad81a670e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 22736 1727204238.14562: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.14929: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22736 1727204238.14947: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 22736 1727204238.14992: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22736 1727204238.15072: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 22736 1727204238.15103: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81af7290> <<< 22736 1727204238.15187: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81af4050> <<< 22736 1727204238.15355: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a720f0> <<< 22736 1727204238.15383: stdout chunk (state=3): >>>import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a6f0b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 22736 1727204238.15463: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 22736 1727204238.15582: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.15610: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 22736 1727204238.15723: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.15971: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.16020: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16074: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16140: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 22736 1727204238.16166: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16311: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16450: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16485: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.16553: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 22736 1727204238.16580: stdout chunk (state=3): >>># zipimport: zlib available <<< 
22736 1727204238.16910: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.17234: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.17299: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.17391: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py <<< 22736 1727204238.17402: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204238.17445: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 22736 1727204238.17579: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afe390> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py <<< 22736 1727204238.17608: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' <<< 22736 1727204238.17639: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py <<< 22736 1727204238.17718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' <<< 22736 1727204238.17746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py <<< 22736 1727204238.17791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' <<< 22736 1727204238.17810: stdout chunk (state=3): >>>import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81538890> <<< 22736 1727204238.17854: stdout chunk (state=3): >>># extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.17906: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.17910: stdout chunk (state=3): >>>import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81538bc0> <<< 22736 1727204238.17982: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81add910> <<< 22736 1727204238.18031: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81adcd70> <<< 22736 1727204238.18178: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afcaa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afc6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' <<< 22736 1727204238.18221: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py <<< 22736 1727204238.18234: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' <<< 22736 1727204238.18276: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py <<< 22736 1727204238.18287: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' <<< 22736 1727204238.18345: stdout chunk (state=3): >>># extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.18374: stdout chunk (state=3): >>># extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8153bc50> <<< 22736 1727204238.18395: stdout chunk (state=3): >>>import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153b500> <<< 22736 1727204238.18424: stdout chunk (state=3): >>># extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.18688: stdout chunk (state=3): >>>import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8153b6e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153a930> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153bd40> <<< 22736 1727204238.18716: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 22736 1727204238.18762: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 22736 1727204238.18832: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.18844: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8159a840> <<< 22736 1727204238.18883: stdout chunk (state=3): >>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81598860> <<< 22736 1727204238.18947: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afc7d0> <<< 22736 1727204238.18977: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # <<< 22736 1727204238.19007: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19035: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.other' # <<< 22736 1727204238.19075: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19181: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19302: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 22736 1727204238.19330: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19402: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19476: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 22736 1727204238.19506: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19524: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system' # # zipimport: zlib available <<< 22736 1727204238.19623: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19691: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # <<< 22736 1727204238.19708: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.19777: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.20097: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available <<< 22736 1727204238.20119: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.20195: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.20292: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.20398: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 22736 1727204238.20428: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 22736 1727204238.21365: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22229: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 22736 1727204238.22243: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22331: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22432: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22462: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22520: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # <<< 22736 1727204238.22566: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22600: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22628: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available <<< 22736 1727204238.22722: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22816: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.dns' # <<< 22736 1727204238.22833: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22875: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22934: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 22736 1727204238.22938: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.22963: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 22736 1727204238.23020: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 22736 1727204238.23032: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.23156: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.23309: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' <<< 22736 1727204238.23335: stdout chunk (state=3): >>>import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8159aea0> <<< 22736 1727204238.23383: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 22736 1727204238.23420: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 22736 1727204238.23641: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8159bb90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 22736 1727204238.23752: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.23863: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available <<< 22736 1727204238.24204: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # <<< 22736 1727204238.24208: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.24298: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.24415: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 22736 1727204238.24434: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.24485: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.24573: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py <<< 22736 1727204238.24644: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 22736 1727204238.24752: stdout chunk (state=3): >>># extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.25011: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad815d6cc0> <<< 22736 1727204238.25196: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad815bf5f0> <<< 22736 1727204238.25218: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available <<< 22736 1727204238.25308: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.25401: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 22736 1727204238.25414: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.25552: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.25688: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.25910: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.26155: stdout chunk (state=3): >>>import 
'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # <<< 22736 1727204238.26177: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.26245: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.26299: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.ssh_pub_keys' # <<< 22736 1727204238.26376: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.26464: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py <<< 22736 1727204238.26476: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' <<< 22736 1727204238.26523: stdout chunk (state=3): >>># extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204238.26543: stdout chunk (state=3): >>># extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad80ede480> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80ede3f0> <<< 22736 1727204238.26574: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.user' # <<< 22736 1727204238.26688: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.26719: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.base' # <<< 22736 1727204238.26740: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27026: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27295: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 22736 1727204238.27308: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27479: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27652: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27719: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27817: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 22736 1727204238.27821: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 22736 1727204238.27839: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.27877: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.28131: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.28480: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available <<< 22736 1727204238.28610: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.28879: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available <<< 22736 1727204238.28900: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.28956: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.30122: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.31052: stdout chunk (state=3): >>>import 
'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # <<< 22736 1727204238.31106: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.31307: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.31587: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 22736 1727204238.31780: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.31882: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 22736 1727204238.31953: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.32197: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.32514: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.32631: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network' # <<< 22736 1727204238.32635: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.32776: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available <<< 22736 1727204238.32868: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.33057: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.33476: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.33813: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 22736 1727204238.33895: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.33941: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # <<< 22736 1727204238.34210: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.34214: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.34482: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.34590: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available <<< 22736 1727204238.35012: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.35016: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available <<< 22736 1727204238.35304: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.35849: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 22736 1727204238.35976: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.36073: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 22736 1727204238.36088: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36149: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36209: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.nvme' # <<< 22736 1727204238.36330: stdout chunk (state=3): >>># zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # <<< 22736 1727204238.36346: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36394: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.openbsd' # <<< 22736 1727204238.36462: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36606: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36801: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 22736 1727204238.36853: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36930: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # <<< 22736 1727204238.36944: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.36977: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.37175: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 22736 1727204238.37293: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.37426: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 22736 1727204238.37430: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # <<< 22736 1727204238.37453: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.37517: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.37596: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # <<< 22736 1727204238.37610: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38017: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38598: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # <<< 22736 1727204238.38610: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38612: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38796: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.openbsd' # <<< 22736 1727204238.38800: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38843: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.38976: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # <<< 22736 1727204238.38998: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.default_collectors' # <<< 22736 1727204238.39010: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.39151: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.39311: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 22736 1727204238.39437: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204238.40292: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py <<< 
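The collector imports above feed the JSON result emitted a little further on; its invocation block reports gather_subset: ["min"], gather_timeout: 10, filter: [] and fact_path: /etc/ansible/facts.d. As a rough, hedged sketch (not taken from this captured run; the host pattern and task names are assumptions), an explicit play producing the same minimal fact set would look roughly like:

- name: Gather the minimal fact subset (illustrative sketch, not from this run)
  hosts: managed-node2
  gather_facts: false
  tasks:
    - name: Call setup the way the logged invocation does
      ansible.builtin.setup:
        gather_subset:
          - min
        gather_timeout: 10
      register: minimal_facts

    - name: Inspect one fact from the returned ansible_facts payload
      ansible.builtin.debug:
        var: minimal_facts.ansible_facts.ansible_distribution
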
22736 1727204238.40604: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad80f07a40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80f04950> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80f054f0> <<< 22736 1727204238.41853: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "18", "epoch": "1727204238", "epoch_int": "1727204238", "date": "2024-09-24", "time": "14:57:18", "iso8601_micro": "2024-09-24T18:57:18.396513Z", "iso8601": "2024-09-24T18:57:18Z", "iso8601_basic": "20240924T145718396513", "iso8601_basic_short": "20240924T145718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": 
"https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204238.42582: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 22736 1727204238.42640: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing 
builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread <<< 22736 1727204238.42662: stdout chunk (state=3): >>># cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io <<< 22736 1727204238.42921: stdout chunk (state=3): >>># cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] 
removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast <<< 22736 1727204238.43107: stdout chunk (state=3): >>># destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] 
removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr <<< 22736 1727204238.43128: stdout chunk (state=3): >>># cleanup[2] removing 
ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base<<< 22736 1727204238.43297: stdout chunk (state=3): >>> # cleanup[2] removing ansible.module_utils.facts.hardware.aix <<< 22736 1727204238.43306: stdout chunk (state=3): >>># cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy 
ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna <<< 22736 1727204238.43883: stdout chunk (state=3): >>># destroy _sitebuiltins <<< 22736 1727204238.44096: stdout chunk (state=3): >>># destroy importlib.machinery # destroy importlib._abc # destroy importlib.util <<< 22736 1727204238.44101: stdout chunk (state=3): >>># destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress <<< 22736 1727204238.44110: stdout chunk (state=3): >>># destroy ntpath # destroy importlib <<< 22736 1727204238.44131: stdout chunk (state=3): >>># destroy 
zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp <<< 22736 1727204238.44183: stdout chunk (state=3): >>># destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid <<< 22736 1727204238.44284: stdout chunk (state=3): >>># destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 22736 1727204238.44348: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool <<< 22736 1727204238.44531: stdout chunk (state=3): >>># destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux <<< 22736 1727204238.44612: stdout chunk (state=3): >>># destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket <<< 22736 1727204238.44693: stdout chunk (state=3): >>># destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna<<< 22736 1727204238.44832: stdout chunk (state=3): >>> # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket <<< 22736 1727204238.45054: stdout chunk (state=3): >>># cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools 
# cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22736 1727204238.45237: stdout chunk (state=3): >>># destroy sys.monitoring <<< 22736 1727204238.45249: stdout chunk (state=3): >>># destroy _socket <<< 22736 1727204238.45276: stdout chunk (state=3): >>># destroy _collections <<< 22736 1727204238.45383: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22736 1727204238.45421: stdout chunk (state=3): >>># destroy _typing <<< 22736 1727204238.45443: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves <<< 22736 1727204238.45500: stdout chunk (state=3): >>># destroy _frozen_importlib_external # destroy _imp<<< 22736 1727204238.45682: stdout chunk (state=3): >>> # destroy _io # destroy marshal<<< 22736 1727204238.45916: stdout chunk (state=3): >>> # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 22736 1727204238.46411: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204238.46444: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204238.46524: stderr chunk (state=3): >>><<< 22736 1727204238.46674: stdout chunk (state=3): >>><<< 22736 1727204238.46924: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad821b44d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82183ad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad821b6a20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fa90a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fa9fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe7ec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe7f80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8201f8c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8201ff50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fffb60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81ffd2b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe5070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82043890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820424b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81ffe2a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82040bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82074800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe42f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82074cb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82074b60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82074f50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81fe2e10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82075610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820752e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82090740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82091e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7fad82092d80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad820933e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad820922d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad82093e30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82093560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81dc7d40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df07d0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df0530> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df0800> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81df09e0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81dc5ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df2000> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df0c80> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad82076c60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e1e390> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e36540> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e6f2f0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e95a90> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e6f410> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e371d0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81c6c440> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81e35580> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81df2f30> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7fad81c6c6e0> # zipimport: found 103 names in '/tmp/ansible_setup_payload_myngajcl/ansible_setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cda120> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb10a0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb0200> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cb35c0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d09c10> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d099a0> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d092b0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d09d00> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81cdae40> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d0a9c0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81d0ac00> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81d0b0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b70ec0> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81b72ae0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b73440> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b74620> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b77110> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81b77470> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b753d0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b7b0e0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79bb0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79940> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b79e80> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81b758e0> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bbf260> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bbf3b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bc4f80> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc4d40> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bc74a0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc55e0> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bcebd0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bc7590> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcf9b0> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcfbc0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bcfce0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bbf6b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd3620> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd4b30> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd1dc0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81bd3140> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd19a0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a5cb60> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5d940> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81bd4da0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5d970> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5e8d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a664e0> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a66e40> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a5ff20> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81a65b80> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a670e0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81af7290> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81af4050> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a720f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81a6f0b0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afe390> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import '_compat_pickle' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7fad81538890> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad81538bc0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81add910> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81adcd70> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afcaa0> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afc6e0> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8153bc50> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153b500> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8153b6e0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153a930> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8153bd40> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad8159a840> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81598860> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad81afc7d0> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib 
available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8159aea0> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad8159bb90> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad815d6cc0> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad815bf5f0> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 
'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad80ede480> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80ede3f0> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7fad80f07a40> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80f04950> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7fad80f054f0> {"ansible_facts": {"ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "18", "epoch": "1727204238", "epoch_int": "1727204238", "date": "2024-09-24", "time": "14:57:18", "iso8601_micro": "2024-09-24T18:57:18.396513Z", "iso8601": "2024-09-24T18:57:18Z", "iso8601_basic": "20240924T145718396513", "iso8601_basic_short": "20240924T145718", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_local": {}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_apparmor": {"status": "disabled"}, "ansible_pkg_mgr": "dnf", "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_lsb": {}, "ansible_fips": false, "ansible_service_mgr": "systemd", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "gather_subset": ["min"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["min"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] 
removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing 
ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing 
ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing 
ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy 
ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 
# cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # 
cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing 
ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy 
ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing 
ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # 
destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.connection # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy multiprocessing.context # destroy array # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.process # destroy unicodedata # destroy tempfile # destroy multiprocessing.util # destroy multiprocessing.reduction # destroy selectors # destroy _multiprocessing # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy errno # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping 
re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks 22736 1727204238.49022: done with _execute_module (setup, {'gather_subset': 'min', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204238.49026: _low_level_execute_command(): starting 22736 1727204238.49028: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r 
/root/.ansible/tmp/ansible-tmp-1727204237.6583977-22878-221871853001075/ > /dev/null 2>&1 && sleep 0' 22736 1727204238.49161: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204238.49165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204238.49168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204238.49170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204238.49278: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204238.49343: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204238.49388: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204238.49497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204238.49567: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204238.52546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204238.52612: stderr chunk (state=3): >>><<< 22736 1727204238.52623: stdout chunk (state=3): >>><<< 22736 1727204238.52897: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204238.52901: handler run complete 22736 1727204238.52903: variable 'ansible_facts' from source: unknown 22736 1727204238.53021: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204238.53508: variable 'ansible_facts' from source: unknown 22736 1727204238.53512: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved 
name 22736 1727204238.53686: attempt loop complete, returning result 22736 1727204238.53824: _execute() done 22736 1727204238.53839: dumping result to json 22736 1727204238.53860: done dumping result, returning 22736 1727204238.54051: done running TaskExecutor() for managed-node2/TASK: Gather the minimum subset of ansible_facts required by the network role test [12b410aa-8751-4f4a-548a-00000000008d] 22736 1727204238.54055: sending task result for task 12b410aa-8751-4f4a-548a-00000000008d ok: [managed-node2] 22736 1727204238.54754: no more pending results, returning what we have 22736 1727204238.54758: results queue empty 22736 1727204238.54759: checking for any_errors_fatal 22736 1727204238.54761: done checking for any_errors_fatal 22736 1727204238.54762: checking for max_fail_percentage 22736 1727204238.54764: done checking for max_fail_percentage 22736 1727204238.54765: checking to see if all hosts have failed and the running result is not ok 22736 1727204238.54766: done checking to see if all hosts have failed 22736 1727204238.54767: getting the remaining hosts for this loop 22736 1727204238.54769: done getting the remaining hosts for this loop 22736 1727204238.54773: getting the next task for host managed-node2 22736 1727204238.54784: done getting next task for host managed-node2 22736 1727204238.54787: ^ task is: TASK: Check if system is ostree 22736 1727204238.55007: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204238.55012: getting variables 22736 1727204238.55014: in VariableManager get_vars() 22736 1727204238.55107: Calling all_inventory to load vars for managed-node2 22736 1727204238.55111: Calling groups_inventory to load vars for managed-node2 22736 1727204238.55115: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204238.55357: Calling all_plugins_play to load vars for managed-node2 22736 1727204238.55361: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204238.55367: Calling groups_plugins_play to load vars for managed-node2 22736 1727204238.55895: done sending task result for task 12b410aa-8751-4f4a-548a-00000000008d 22736 1727204238.55899: WORKER PROCESS EXITING 22736 1727204238.55936: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204238.56516: done with get_vars() 22736 1727204238.56529: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Tuesday 24 September 2024 14:57:18 -0400 (0:00:01.078) 0:00:03.351 ***** 22736 1727204238.56715: entering _queue_task() for managed-node2/stat 22736 1727204238.57125: worker is 1 (out of 1 available) 22736 1727204238.57138: exiting _queue_task() for managed-node2/stat 22736 1727204238.57150: done queuing things up, now waiting for results queue to drain 22736 1727204238.57151: waiting for pending results... 
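(Annotation, not part of the captured output.) The entries above show the facts-gathering task returning "ok" for managed-node2 and the next task, "Check if system is ostree", being queued; the entries that follow show the worker running that stat task over the multiplexed SSH session: create a per-task temp directory under ~/.ansible/tmp, upload the AnsiballZ_stat.py payload via the sftp subsystem, chmod it, and run it with the remote /usr/bin/python3.12 (each module run ends with an "rm -f -r" cleanup like the one shown above for the setup module). The sketch below mirrors that command sequence in plain Python. It is illustrative only: the host alias and interpreter path are taken from this log, while the helper names and use of plain ssh/scp are assumptions, not Ansible's own implementation.

import subprocess
import time

HOST = "managed-node2"            # inventory alias seen in this log
PYTHON = "/usr/bin/python3.12"    # remote interpreter used in this log

def ssh(cmd):
    # Ansible reuses an OpenSSH ControlMaster session ("auto-mux: Trying existing
    # master" above); a plain ssh invocation is close enough for illustration.
    return subprocess.run(["ssh", HOST, cmd], capture_output=True, text=True, check=True)

def run_module(local_payload):
    # 1. Unique remote temp directory, created with umask 77 as in the log.
    tmp = f"~/.ansible/tmp/ansible-tmp-{time.time()}"
    ssh(f"umask 77 && mkdir -p {tmp}")
    # 2. Transfer the AnsiballZ payload (the log uses the SFTP subsystem for this).
    subprocess.run(["scp", local_payload, f"{HOST}:{tmp}/AnsiballZ_stat.py"], check=True)
    # 3. Mark the directory and payload executable, then run it with the remote interpreter.
    ssh(f"chmod u+x {tmp} {tmp}/AnsiballZ_stat.py")
    result = ssh(f"{PYTHON} {tmp}/AnsiballZ_stat.py")
    # 4. Clean up, mirroring the "rm -f -r ... > /dev/null 2>&1" step seen above.
    ssh(f"rm -rf {tmp} > /dev/null 2>&1")
    return result.stdout

The actual path the role's stat task checks is defined in el_repo_setup.yml and is not visible in this excerpt; on rpm-ostree systems the conventional marker is /run/ostree-booted, but treat that as an assumption here.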
22736 1727204238.57336: running TaskExecutor() for managed-node2/TASK: Check if system is ostree 22736 1727204238.57472: in run() - task 12b410aa-8751-4f4a-548a-00000000008f 22736 1727204238.57496: variable 'ansible_search_path' from source: unknown 22736 1727204238.57504: variable 'ansible_search_path' from source: unknown 22736 1727204238.57548: calling self._execute() 22736 1727204238.57639: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204238.57653: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204238.57674: variable 'omit' from source: magic vars 22736 1727204238.58256: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204238.58658: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204238.58718: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204238.58773: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204238.58836: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204238.59085: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204238.59090: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204238.59096: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204238.59098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204238.59541: Evaluated conditional (not __network_is_ostree is defined): True 22736 1727204238.59546: variable 'omit' from source: magic vars 22736 1727204238.59627: variable 'omit' from source: magic vars 22736 1727204238.59733: variable 'omit' from source: magic vars 22736 1727204238.59974: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204238.59978: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204238.59980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204238.59982: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204238.59984: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204238.59986: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204238.60094: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204238.60105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204238.60347: Set connection var ansible_timeout to 10 22736 1727204238.60425: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204238.60516: Set 
connection var ansible_shell_executable to /bin/sh 22736 1727204238.60520: Set connection var ansible_shell_type to sh 22736 1727204238.60522: Set connection var ansible_pipelining to False 22736 1727204238.60524: Set connection var ansible_connection to ssh 22736 1727204238.60527: variable 'ansible_shell_executable' from source: unknown 22736 1727204238.60529: variable 'ansible_connection' from source: unknown 22736 1727204238.60531: variable 'ansible_module_compression' from source: unknown 22736 1727204238.60533: variable 'ansible_shell_type' from source: unknown 22736 1727204238.60606: variable 'ansible_shell_executable' from source: unknown 22736 1727204238.60619: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204238.60639: variable 'ansible_pipelining' from source: unknown 22736 1727204238.60681: variable 'ansible_timeout' from source: unknown 22736 1727204238.60694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204238.60941: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204238.60965: variable 'omit' from source: magic vars 22736 1727204238.60995: starting attempt loop 22736 1727204238.60999: running the handler 22736 1727204238.61015: _low_level_execute_command(): starting 22736 1727204238.61063: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204238.61892: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204238.61956: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204238.62024: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204238.62044: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204238.62097: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204238.64984: stdout chunk (state=3): >>>/root <<< 22736 1727204238.64991: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204238.64995: stdout chunk (state=3): >>><<< 22736 1727204238.64998: stderr chunk (state=3): >>><<< 22736 1727204238.65001: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204238.65010: _low_level_execute_command(): starting 22736 1727204238.65016: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446 `" && echo ansible-tmp-1727204238.6487117-22914-55779580031446="` echo /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446 `" ) && sleep 0' 22736 1727204238.65880: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204238.65932: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204238.65953: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204238.65980: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204238.66008: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204238.66105: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204238.66161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204238.66185: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204238.66292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204238.69401: stdout chunk (state=3): >>>ansible-tmp-1727204238.6487117-22914-55779580031446=/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446 <<< 22736 1727204238.69680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204238.69685: stdout chunk (state=3): >>><<< 22736 1727204238.69688: stderr chunk (state=3): >>><<< 22736 1727204238.69900: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204238.6487117-22914-55779580031446=/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204238.69904: variable 'ansible_module_compression' from source: unknown 22736 1727204238.69906: ANSIBALLZ: Using lock for stat 22736 1727204238.69909: ANSIBALLZ: Acquiring lock 22736 1727204238.69911: ANSIBALLZ: Lock acquired: 140553537015680 22736 1727204238.69916: ANSIBALLZ: Creating module 22736 1727204239.00768: ANSIBALLZ: Writing module into payload 22736 1727204239.00906: ANSIBALLZ: Writing module 22736 1727204239.00938: ANSIBALLZ: Renaming module 22736 1727204239.00950: ANSIBALLZ: Done creating module 22736 1727204239.00974: variable 'ansible_facts' from source: unknown 22736 1727204239.01057: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py 22736 1727204239.01328: Sending initial data 22736 1727204239.01331: Sent initial data (152 bytes) 22736 1727204239.02009: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204239.02086: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204239.02124: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204239.02198: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204239.04827: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports 
extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py" <<< 22736 1727204239.04832: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp1gc36rpx /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py <<< 22736 1727204239.04896: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 22736 1727204239.04911: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp1gc36rpx" to remote "/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py" <<< 22736 1727204239.04927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py" <<< 22736 1727204239.06277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204239.06424: stderr chunk (state=3): >>><<< 22736 1727204239.06441: stdout chunk (state=3): >>><<< 22736 1727204239.06472: done transferring module to remote 22736 1727204239.06498: _low_level_execute_command(): starting 22736 1727204239.06523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/ /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py && sleep 0' 22736 1727204239.07253: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204239.07282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204239.07304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204239.07403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204239.07442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204239.07460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204239.07487: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204239.07574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204239.10549: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204239.10554: stdout chunk (state=3): >>><<< 22736 1727204239.10557: stderr chunk (state=3): >>><<< 22736 1727204239.10680: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204239.10684: _low_level_execute_command(): starting 22736 1727204239.10687: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/AnsiballZ_stat.py && sleep 0' 22736 1727204239.11616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204239.11642: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204239.11659: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204239.11678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204239.11704: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204239.11759: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204239.11826: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204239.11853: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204239.11880: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204239.11980: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204239.15353: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 22736 1727204239.15428: stdout chunk (state=3): >>>import _imp # builtin<<< 22736 1727204239.15455: stdout chunk (state=3): >>> <<< 22736 
1727204239.15477: stdout chunk (state=3): >>>import '_thread' # <<< 22736 1727204239.15517: stdout chunk (state=3): >>>import '_warnings' # <<< 22736 1727204239.15633: stdout chunk (state=3): >>>import '_weakref' # import '_io' # <<< 22736 1727204239.15637: stdout chunk (state=3): >>> <<< 22736 1727204239.15658: stdout chunk (state=3): >>>import 'marshal' # <<< 22736 1727204239.15723: stdout chunk (state=3): >>>import 'posix' # <<< 22736 1727204239.15743: stdout chunk (state=3): >>> <<< 22736 1727204239.15795: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 22736 1727204239.15823: stdout chunk (state=3): >>> # installing zipimport hook <<< 22736 1727204239.15858: stdout chunk (state=3): >>>import 'time' # <<< 22736 1727204239.15884: stdout chunk (state=3): >>> import 'zipimport' # <<< 22736 1727204239.15984: stdout chunk (state=3): >>> # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py <<< 22736 1727204239.16019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204239.16022: stdout chunk (state=3): >>> <<< 22736 1727204239.16079: stdout chunk (state=3): >>>import '_codecs' # <<< 22736 1727204239.16083: stdout chunk (state=3): >>> <<< 22736 1727204239.16124: stdout chunk (state=3): >>>import 'codecs' # <<< 22736 1727204239.16127: stdout chunk (state=3): >>> <<< 22736 1727204239.16195: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py<<< 22736 1727204239.16258: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 22736 1727204239.16262: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa12c4d0><<< 22736 1727204239.16316: stdout chunk (state=3): >>> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa0fbad0><<< 22736 1727204239.16319: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py<<< 22736 1727204239.16348: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 22736 1727204239.16362: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa12ea20> <<< 22736 1727204239.16442: stdout chunk (state=3): >>>import '_signal' # import '_abc' # <<< 22736 1727204239.16447: stdout chunk (state=3): >>> import 'abc' # <<< 22736 1727204239.16460: stdout chunk (state=3): >>> <<< 22736 1727204239.16524: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 22736 1727204239.16549: stdout chunk (state=3): >>> import 'stat' # <<< 22736 1727204239.16693: stdout chunk (state=3): >>>import '_collections_abc' # <<< 22736 1727204239.16740: stdout chunk (state=3): >>> import 'genericpath' # <<< 22736 1727204239.16764: stdout chunk (state=3): >>> import 'posixpath' # <<< 22736 1727204239.16788: stdout chunk (state=3): >>> <<< 22736 1727204239.16833: stdout chunk (state=3): >>>import 'os' # import '_sitebuiltins' # <<< 22736 1727204239.16872: stdout chunk (state=3): >>>Processing user site-packages Processing global site-packages<<< 22736 1727204239.16910: stdout chunk (state=3): >>> Adding directory: 
'/usr/local/lib/python3.12/site-packages'<<< 22736 1727204239.16913: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 22736 1727204239.16947: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 22736 1727204239.17010: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 22736 1727204239.17061: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f410a0><<< 22736 1727204239.17065: stdout chunk (state=3): >>> <<< 22736 1727204239.17184: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 22736 1727204239.17189: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204239.17207: stdout chunk (state=3): >>> import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f41fd0><<< 22736 1727204239.17268: stdout chunk (state=3): >>> import 'site' # <<< 22736 1727204239.17272: stdout chunk (state=3): >>> <<< 22736 1727204239.17331: stdout chunk (state=3): >>>Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux <<< 22736 1727204239.17334: stdout chunk (state=3): >>>Type "help", "copyright", "credits" or "license" for more information.<<< 22736 1727204239.17381: stdout chunk (state=3): >>> <<< 22736 1727204239.17740: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py<<< 22736 1727204239.17766: stdout chunk (state=3): >>> <<< 22736 1727204239.17794: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 22736 1727204239.17826: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 22736 1727204239.17894: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py<<< 22736 1727204239.17898: stdout chunk (state=3): >>> <<< 22736 1727204239.17951: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc'<<< 22736 1727204239.17994: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 22736 1727204239.18038: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc'<<< 22736 1727204239.18067: stdout chunk (state=3): >>> import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7fec0> <<< 22736 1727204239.18133: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc'<<< 22736 1727204239.18149: stdout chunk (state=3): >>> <<< 22736 1727204239.18178: stdout chunk (state=3): >>>import '_operator' # <<< 22736 
1727204239.18230: stdout chunk (state=3): >>>import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 22736 1727204239.18332: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 22736 1727204239.18418: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204239.18493: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py <<< 22736 1727204239.18518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 22736 1727204239.18560: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fb78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 22736 1727204239.18586: stdout chunk (state=3): >>> <<< 22736 1727204239.18614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fb7f50><<< 22736 1727204239.18636: stdout chunk (state=3): >>> import '_collections' # <<< 22736 1727204239.18729: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f97b60> import '_functools' # <<< 22736 1727204239.18788: stdout chunk (state=3): >>> import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f952b0> <<< 22736 1727204239.18942: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7d070> <<< 22736 1727204239.18994: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 22736 1727204239.19049: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # <<< 22736 1727204239.19063: stdout chunk (state=3): >>> <<< 22736 1727204239.19124: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc'<<< 22736 1727204239.19162: stdout chunk (state=3): >>> <<< 22736 1727204239.19165: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 22736 1727204239.19182: stdout chunk (state=3): >>> <<< 22736 1727204239.19239: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fdb890><<< 22736 1727204239.19272: stdout chunk (state=3): >>> <<< 22736 1727204239.19276: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fda4b0><<< 22736 1727204239.19313: stdout chunk (state=3): >>> <<< 22736 1727204239.19317: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches 
/usr/lib64/python3.12/re/_casefix.py <<< 22736 1727204239.19356: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f962a0><<< 22736 1727204239.19360: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fd8bc0><<< 22736 1727204239.19428: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 22736 1727204239.19460: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc'<<< 22736 1727204239.19488: stdout chunk (state=3): >>> import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00c800> <<< 22736 1727204239.19512: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7c2f0> <<< 22736 1727204239.19546: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc'<<< 22736 1727204239.19578: stdout chunk (state=3): >>> <<< 22736 1727204239.19617: stdout chunk (state=3): >>># extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.19636: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.19696: stdout chunk (state=3): >>> import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa00ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.19716: stdout chunk (state=3): >>> <<< 22736 1727204239.19787: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa00cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204239.19810: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 22736 1727204239.19868: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00d610><<< 22736 1727204239.19901: stdout chunk (state=3): >>> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00d2e0><<< 22736 1727204239.19905: stdout chunk (state=3): >>> import 'importlib.machinery' # <<< 22736 1727204239.19981: stdout chunk (state=3): >>> # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 22736 1727204239.19985: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc'<<< 22736 
1727204239.20013: stdout chunk (state=3): >>> <<< 22736 1727204239.20017: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00e510><<< 22736 1727204239.20049: stdout chunk (state=3): >>> import 'importlib.util' # <<< 22736 1727204239.20078: stdout chunk (state=3): >>>import 'runpy' # <<< 22736 1727204239.20127: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py<<< 22736 1727204239.20142: stdout chunk (state=3): >>> <<< 22736 1727204239.20188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc'<<< 22736 1727204239.20230: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 22736 1727204239.20261: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa028740><<< 22736 1727204239.20303: stdout chunk (state=3): >>> import 'errno' # <<< 22736 1727204239.20315: stdout chunk (state=3): >>> <<< 22736 1727204239.20357: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.20407: stdout chunk (state=3): >>> import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa029e80><<< 22736 1727204239.20410: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py<<< 22736 1727204239.20428: stdout chunk (state=3): >>> <<< 22736 1727204239.20455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py<<< 22736 1727204239.20478: stdout chunk (state=3): >>> <<< 22736 1727204239.20515: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc'<<< 22736 1727204239.20534: stdout chunk (state=3): >>> import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa02ad80> <<< 22736 1727204239.20564: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.20610: stdout chunk (state=3): >>> import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa02b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa02a2d0><<< 22736 1727204239.20643: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py<<< 22736 1727204239.20669: stdout chunk (state=3): >>> <<< 22736 1727204239.20691: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc'<<< 22736 1727204239.20736: stdout chunk (state=3): >>> # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.20771: stdout chunk (state=3): >>># extension module '_lzma' executed from 
'/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.20803: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa02be30> <<< 22736 1727204239.20900: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa02b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00e570><<< 22736 1727204239.20917: stdout chunk (state=3): >>> <<< 22736 1727204239.20948: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py<<< 22736 1727204239.20961: stdout chunk (state=3): >>> <<< 22736 1727204239.21005: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc'<<< 22736 1727204239.21010: stdout chunk (state=3): >>> <<< 22736 1727204239.21032: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 22736 1727204239.21085: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc'<<< 22736 1727204239.21101: stdout chunk (state=3): >>> <<< 22736 1727204239.21133: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9debd40><<< 22736 1727204239.21175: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py <<< 22736 1727204239.21225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.21246: stdout chunk (state=3): >>># extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e145c0><<< 22736 1727204239.21301: stdout chunk (state=3): >>> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.21330: stdout chunk (state=3): >>> # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14890><<< 22736 1727204239.21361: stdout chunk (state=3): >>> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.21376: stdout chunk (state=3): >>> # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.21421: stdout chunk (state=3): >>> import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9de9ee0> <<< 22736 1727204239.21440: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py<<< 
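(Annotation, not part of the captured output.) The verbose import trace in this stdout stream comes from running AnsiballZ_stat.py under PYTHONVERBOSE=1: the wrapper first boots the standard library, and a few entries further on the line "zipimport: found 30 names in '/tmp/ansible_stat_payload_3qo0iess/ansible_stat_payload.zip'" shows the bundled module code being imported directly from a zip archive on sys.path. The snippet below is a minimal, self-contained demonstration of that zip-on-sys.path import mechanism; the archive and module names are made up for the example and are not Ansible's.

import importlib
import sys
import zipfile

def demo_zip_import(archive="payload_demo.zip"):
    # Write a tiny module into a zip archive (hypothetical names).
    with zipfile.ZipFile(archive, "w") as zf:
        zf.writestr("payload_demo_mod.py", "GREETING = 'imported from a zip'\n")
    # Putting the archive on sys.path lets the default zipimport hook resolve imports from it.
    sys.path.insert(0, archive)
    mod = importlib.import_module("payload_demo_mod")
    return mod.GREETING

print(demo_zip_import())  # -> imported from a zip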
22736 1727204239.21584: stdout chunk (state=3): >>> <<< 22736 1727204239.21665: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc'<<< 22736 1727204239.21678: stdout chunk (state=3): >>> <<< 22736 1727204239.21713: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 22736 1727204239.21741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 22736 1727204239.21779: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e16180> <<< 22736 1727204239.21813: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e14e00><<< 22736 1727204239.21825: stdout chunk (state=3): >>> <<< 22736 1727204239.21865: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00ec60><<< 22736 1727204239.21897: stdout chunk (state=3): >>> <<< 22736 1727204239.21921: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py<<< 22736 1727204239.21932: stdout chunk (state=3): >>> <<< 22736 1727204239.22017: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204239.22031: stdout chunk (state=3): >>> <<< 22736 1727204239.22131: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc'<<< 22736 1727204239.22135: stdout chunk (state=3): >>> <<< 22736 1727204239.22194: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e3e510> <<< 22736 1727204239.22279: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py<<< 22736 1727204239.22326: stdout chunk (state=3): >>> <<< 22736 1727204239.22330: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204239.22347: stdout chunk (state=3): >>> <<< 22736 1727204239.22394: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc'<<< 22736 1727204239.22418: stdout chunk (state=3): >>> <<< 22736 1727204239.22483: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e5a690><<< 22736 1727204239.22506: stdout chunk (state=3): >>> <<< 22736 1727204239.22593: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc'<<< 22736 1727204239.22597: stdout chunk (state=3): >>> <<< 22736 1727204239.22707: stdout chunk (state=3): >>>import 'ntpath' # <<< 22736 1727204239.22710: stdout chunk (state=3): >>> <<< 22736 1727204239.22774: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py<<< 22736 1727204239.22778: stdout chunk (state=3): 
>>> # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc'<<< 22736 1727204239.22812: stdout chunk (state=3): >>> import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e8f410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 22736 1727204239.22886: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc'<<< 22736 1727204239.22920: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 22736 1727204239.23015: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc'<<< 22736 1727204239.23181: stdout chunk (state=3): >>> import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9eb9bb0><<< 22736 1727204239.23284: stdout chunk (state=3): >>> <<< 22736 1727204239.23353: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e8f530><<< 22736 1727204239.23357: stdout chunk (state=3): >>> <<< 22736 1727204239.23422: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e5b320> <<< 22736 1727204239.23478: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py<<< 22736 1727204239.23505: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc'<<< 22736 1727204239.23533: stdout chunk (state=3): >>> import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c944a0><<< 22736 1727204239.23554: stdout chunk (state=3): >>> <<< 22736 1727204239.23602: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e596d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e170b0><<< 22736 1727204239.23614: stdout chunk (state=3): >>> <<< 22736 1727204239.23789: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc'<<< 22736 1727204239.23827: stdout chunk (state=3): >>> <<< 22736 1727204239.23830: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f13a9e597f0><<< 22736 1727204239.23875: stdout chunk (state=3): >>> <<< 22736 1727204239.23999: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_3qo0iess/ansible_stat_payload.zip' <<< 22736 1727204239.24033: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.24074: stdout chunk (state=3): >>> <<< 22736 1727204239.24371: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.24416: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py<<< 22736 1727204239.24431: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 22736 1727204239.24578: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 22736 1727204239.24658: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 22736 1727204239.24730: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py <<< 22736 1727204239.24761: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cee0c0><<< 22736 1727204239.24781: stdout chunk (state=3): >>> import '_typing' # <<< 22736 1727204239.24978: stdout chunk (state=3): >>> <<< 22736 1727204239.25122: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc5040><<< 22736 1727204239.25125: stdout chunk (state=3): >>> <<< 22736 1727204239.25184: stdout chunk (state=3): >>>import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc41a0> # zipimport: zlib available<<< 22736 1727204239.25200: stdout chunk (state=3): >>> <<< 22736 1727204239.25238: stdout chunk (state=3): >>>import 'ansible' # <<< 22736 1727204239.25241: stdout chunk (state=3): >>> <<< 22736 1727204239.25271: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.25316: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22736 1727204239.25350: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.25373: stdout chunk (state=3): >>> <<< 22736 1727204239.25412: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 22736 1727204239.25437: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.27992: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.30269: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc7fe0><<< 22736 1727204239.30274: stdout chunk (state=3): >>> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py <<< 22736 1727204239.30286: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204239.30317: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py<<< 22736 1727204239.30341: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' <<< 22736 1727204239.30390: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc'<<< 22736 1727204239.30445: stdout chunk (state=3): >>> # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.30470: stdout chunk (state=3): >>># extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d19b80><<< 22736 1727204239.30533: stdout chunk (state=3): >>> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19910><<< 22736 1727204239.30555: stdout chunk (state=3): >>> <<< 22736 1727204239.30605: stdout chunk (state=3): >>>import 'json.decoder' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19220> <<< 22736 1727204239.30635: stdout chunk (state=3): >>># /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py<<< 22736 1727204239.30665: stdout chunk (state=3): >>> <<< 22736 1727204239.30670: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc'<<< 22736 1727204239.30732: stdout chunk (state=3): >>> import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19c70><<< 22736 1727204239.30735: stdout chunk (state=3): >>> <<< 22736 1727204239.30757: stdout chunk (state=3): >>>import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9ceeb70> <<< 22736 1727204239.30791: stdout chunk (state=3): >>>import 'atexit' # <<< 22736 1727204239.30841: stdout chunk (state=3): >>># extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.30896: stdout chunk (state=3): >>> import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d1a930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.30902: stdout chunk (state=3): >>> <<< 22736 1727204239.30954: stdout chunk (state=3): >>># extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.30968: stdout chunk (state=3): >>>import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d1ab70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 22736 1727204239.31078: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc'<<< 22736 1727204239.31099: stdout chunk (state=3): >>> import '_locale' # <<< 22736 1727204239.31434: stdout chunk (state=3): >>>import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d1b0b0><<< 22736 1727204239.31439: stdout chunk (state=3): >>> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b7ce90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9b7eab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 22736 1727204239.31469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc'<<< 22736 1727204239.31538: stdout chunk (state=3): >>> import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b7f3b0> <<< 22736 1727204239.31579: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 22736 1727204239.31718: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 
'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b80590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py<<< 22736 1727204239.31776: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc'<<< 22736 1727204239.31802: stdout chunk (state=3): >>> <<< 22736 1727204239.31847: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py <<< 22736 1727204239.31860: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 22736 1727204239.32031: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b83080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.32057: stdout chunk (state=3): >>># extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9b831d0><<< 22736 1727204239.32102: stdout chunk (state=3): >>> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b81340><<< 22736 1727204239.32129: stdout chunk (state=3): >>> <<< 22736 1727204239.32155: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py <<< 22736 1727204239.32323: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc'<<< 22736 1727204239.32361: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py <<< 22736 1727204239.32385: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc'<<< 22736 1727204239.32411: stdout chunk (state=3): >>> import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b86ff0><<< 22736 1727204239.32433: stdout chunk (state=3): >>> import '_tokenize' # <<< 22736 1727204239.32534: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85820><<< 22736 1727204239.32571: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py <<< 22736 1727204239.32734: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc'<<< 22736 1727204239.32757: stdout chunk (state=3): >>> import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b87ef0> <<< 22736 1727204239.32794: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b81850><<< 22736 1727204239.32829: stdout chunk (state=3): >>> <<< 22736 1727204239.32893: stdout chunk (state=3): >>># 
extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.32930: stdout chunk (state=3): >>>import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bcf110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py<<< 22736 1727204239.32973: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bcf290> <<< 22736 1727204239.33054: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' <<< 22736 1727204239.33095: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py<<< 22736 1727204239.33111: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc'<<< 22736 1727204239.33163: stdout chunk (state=3): >>> # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.33198: stdout chunk (state=3): >>> # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bd0e60><<< 22736 1727204239.33227: stdout chunk (state=3): >>> <<< 22736 1727204239.33244: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd0c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py<<< 22736 1727204239.33480: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 22736 1727204239.33509: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.33539: stdout chunk (state=3): >>> # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.33542: stdout chunk (state=3): >>> import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bd33b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd1550><<< 22736 1727204239.33575: stdout chunk (state=3): >>> <<< 22736 1727204239.33629: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py<<< 22736 1727204239.33632: stdout chunk (state=3): >>> <<< 22736 1727204239.33729: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py<<< 22736 1727204239.33765: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc'<<< 22736 1727204239.33786: stdout chunk (state=3): >>> import '_string' # <<< 22736 
1727204239.34083: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdaba0> <<< 22736 1727204239.34160: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd3530> <<< 22736 1727204239.34312: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.34317: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.34341: stdout chunk (state=3): >>>import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdbe60> <<< 22736 1727204239.34396: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.34406: stdout chunk (state=3): >>> # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.34433: stdout chunk (state=3): >>>import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdba10> <<< 22736 1727204239.34523: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.34527: stdout chunk (state=3): >>># extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.34572: stdout chunk (state=3): >>>import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdbf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bcf590> <<< 22736 1727204239.34623: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py<<< 22736 1727204239.34634: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' <<< 22736 1727204239.34678: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py <<< 22736 1727204239.34725: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 22736 1727204239.34843: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.34854: stdout chunk (state=3): >>> import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdf680> <<< 22736 1727204239.35183: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.35511: stdout chunk (state=3): >>> # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9be0650> import 'socket' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdddf0><<< 22736 1727204239.35559: stdout chunk (state=3): >>> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdf170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdd9d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 22736 1727204239.35615: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.35795: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22736 1727204239.35834: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22736 1727204239.35892: stdout chunk (state=3): >>>import 'ansible.module_utils.common' # <<< 22736 1727204239.35918: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.35964: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # <<< 22736 1727204239.36095: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.36255: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.36519: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.37720: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.37977: stdout chunk (state=3): >>> <<< 22736 1727204239.38947: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # <<< 22736 1727204239.39027: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # <<< 22736 1727204239.39201: stdout chunk (state=3): >>> import 'ansible.module_utils.common.text.converters' # <<< 22736 1727204239.39233: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 22736 1727204239.39259: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9c68830> <<< 22736 1727204239.39445: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 22736 1727204239.39475: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 22736 1727204239.39497: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c69580> <<< 22736 1727204239.39588: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85a30> <<< 22736 1727204239.40021: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 22736 1727204239.40059: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available <<< 22736 
1727204239.40278: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.40386: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 22736 1727204239.40472: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c69340> # zipimport: zlib available <<< 22736 1727204239.41473: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.41694: stdout chunk (state=3): >>> <<< 22736 1727204239.42424: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.42471: stdout chunk (state=3): >>> <<< 22736 1727204239.42580: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.42605: stdout chunk (state=3): >>> <<< 22736 1727204239.42797: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 22736 1727204239.42869: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.42893: stdout chunk (state=3): >>> <<< 22736 1727204239.43044: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 22736 1727204239.43145: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.43149: stdout chunk (state=3): >>> <<< 22736 1727204239.43319: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 22736 1727204239.43378: stdout chunk (state=3): >>> <<< 22736 1727204239.43396: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 22736 1727204239.43448: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22736 1727204239.43494: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.43572: stdout chunk (state=3): >>> import 'ansible.module_utils.parsing.convert_bool' # <<< 22736 1727204239.43606: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.44097: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.44167: stdout chunk (state=3): >>> <<< 22736 1727204239.44721: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py<<< 22736 1727204239.44724: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 22736 1727204239.44758: stdout chunk (state=3): >>>import '_ast' # <<< 22736 1727204239.44769: stdout chunk (state=3): >>> <<< 22736 1727204239.44927: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c6bf50> <<< 22736 1727204239.44963: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.44974: stdout chunk (state=3): >>> <<< 22736 1727204239.45195: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.45256: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # <<< 22736 1727204239.45286: stdout chunk (state=3): >>>import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # <<< 22736 1727204239.45371: stdout chunk (state=3): >>>import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc'<<< 22736 
1727204239.45494: stdout chunk (state=3): >>> # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.45702: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.45720: stdout chunk (state=3): >>>import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a75f10> <<< 22736 1727204239.45782: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so'<<< 22736 1727204239.45841: stdout chunk (state=3): >>> # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a76840><<< 22736 1727204239.45861: stdout chunk (state=3): >>> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c6b290> <<< 22736 1727204239.45888: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.45969: stdout chunk (state=3): >>> # zipimport: zlib available <<< 22736 1727204239.46053: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # <<< 22736 1727204239.46076: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.46161: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.46245: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.46360: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22736 1727204239.46375: stdout chunk (state=3): >>> <<< 22736 1727204239.46495: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 22736 1727204239.46584: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204239.46754: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' <<< 22736 1727204239.46772: stdout chunk (state=3): >>># extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a756d0> <<< 22736 1727204239.46850: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a76a80><<< 22736 1727204239.46922: stdout chunk (state=3): >>> import 'ansible.module_utils.common.file' # <<< 22736 1727204239.46950: stdout chunk (state=3): >>>import 'ansible.module_utils.common.process' # <<< 22736 1727204239.47088: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available<<< 22736 1727204239.47102: stdout chunk (state=3): >>> <<< 22736 1727204239.47211: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.47361: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' <<< 22736 1727204239.47391: stdout chunk (state=3): >>># 
/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py<<< 22736 1727204239.47414: stdout chunk (state=3): >>> <<< 22736 1727204239.47454: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc'<<< 22736 1727204239.47460: stdout chunk (state=3): >>> <<< 22736 1727204239.47493: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 22736 1727204239.47628: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 22736 1727204239.47744: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc'<<< 22736 1727204239.48195: stdout chunk (state=3): >>> import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b06d50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a80b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a7eb70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a7e9c0> # destroy ansible.module_utils.distro<<< 22736 1727204239.48198: stdout chunk (state=3): >>> import 'ansible.module_utils.distro' # # zipimport: zlib available<<< 22736 1727204239.48201: stdout chunk (state=3): >>> # zipimport: zlib available<<< 22736 1727204239.48203: stdout chunk (state=3): >>> <<< 22736 1727204239.48205: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # <<< 22736 1727204239.48207: stdout chunk (state=3): >>> import 'ansible.module_utils.common.sys_info' # <<< 22736 1727204239.48301: stdout chunk (state=3): >>> import 'ansible.module_utils.basic' # <<< 22736 1727204239.48521: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available <<< 22736 1727204239.48682: stdout chunk (state=3): >>># zipimport: zlib available<<< 22736 1727204239.48702: stdout chunk (state=3): >>> <<< 22736 1727204239.49174: stdout chunk (state=3): >>># zipimport: zlib available <<< 22736 1727204239.49244: stdout chunk (state=3): >>> <<< 22736 1727204239.49268: stdout chunk (state=3): >>>{"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} <<< 22736 1727204239.49290: stdout chunk (state=3): >>># destroy __main__<<< 22736 1727204239.49332: stdout chunk (state=3): >>> <<< 22736 1727204239.49802: stdout chunk (state=3): >>># clear sys.path_importer_cache<<< 22736 1727204239.49806: stdout chunk (state=3): >>> <<< 22736 1727204239.49844: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ <<< 22736 1727204239.49848: stdout chunk (state=3): >>># clear sys.path <<< 22736 1727204239.49897: stdout chunk (state=3): >>># clear sys.argv <<< 22736 1727204239.49900: stdout chunk (state=3): >>># clear sys.ps1<<< 22736 1727204239.49903: stdout chunk (state=3): >>> # clear sys.ps2 <<< 22736 1727204239.49930: stdout chunk (state=3): >>># clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin <<< 22736 
1727204239.49945: stdout chunk (state=3): >>># restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix <<< 22736 1727204239.49980: stdout chunk (state=3): >>># cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat<<< 22736 1727204239.50014: stdout chunk (state=3): >>> # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator<<< 22736 1727204239.50057: stdout chunk (state=3): >>> # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants<<< 22736 1727204239.50105: stdout chunk (state=3): >>> # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 <<< 22736 1727204239.50129: stdout chunk (state=3): >>># cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2<<< 22736 1727204239.50154: stdout chunk (state=3): >>> # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset<<< 22736 1727204239.50173: stdout chunk (state=3): >>> # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing 
zipfile._path<<< 22736 1727204239.50333: stdout chunk (state=3): >>> # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy 
ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_da<<< 22736 1727204239.50353: stdout chunk (state=3): >>>ta4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 22736 1727204239.50775: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 22736 1727204239.50778: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress <<< 22736 1727204239.50799: stdout chunk (state=3): >>># destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib <<< 22736 1727204239.51079: stdout chunk (state=3): >>># destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux<<< 22736 1727204239.51135: stdout chunk (state=3): >>> # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping 
encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys <<< 22736 1727204239.51156: stdout chunk (state=3): >>># cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 22736 1727204239.51344: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket <<< 22736 1727204239.51357: stdout chunk (state=3): >>># destroy _collections <<< 22736 1727204239.51416: stdout chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath <<< 22736 1727204239.51448: stdout chunk (state=3): >>># destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib <<< 22736 1727204239.51507: stdout chunk (state=3): >>># destroy _typing <<< 22736 1727204239.51548: stdout chunk (state=3): >>># destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal <<< 22736 1727204239.51679: stdout chunk (state=3): >>># clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 22736 1727204239.51776: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref <<< 22736 1727204239.51819: stdout chunk (state=3): >>># destroy _operator # destroy _sha2 # destroy _string # destroy re <<< 22736 
1727204239.51870: stdout chunk (state=3): >>># destroy itertools <<< 22736 1727204239.51878: stdout chunk (state=3): >>># destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins <<< 22736 1727204239.51906: stdout chunk (state=3): >>># destroy _thread # clear sys.audit hooks <<< 22736 1727204239.52445: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204239.52505: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 22736 1727204239.52565: stderr chunk (state=3): >>><<< 22736 1727204239.52583: stdout chunk (state=3): >>><<< 22736 1727204239.52793: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa12c4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa0fbad0> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa12ea20> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/local/lib/python3.12/site-packages' Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f410a0> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f41fd0> import 'site' # Python 3.12.6 (main, Sep 9 2024, 00:00:00) [GCC 13.3.1 20240522 (Red Hat 13.3.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7fec0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7ff80> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fb78c0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fb7f50> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f97b60> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f952b0> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7d070> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fdb890> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fda4b0> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f962a0> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9fd8bc0> # 
/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00c800> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7c2f0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa00ccb0> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00cb60> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa00cf50> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9f7ae10> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00d610> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00d2e0> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00e510> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa028740> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa029e80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f13aa02ad80> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa02b3e0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa02a2d0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13aa02be30> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa02b560> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00e570> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9debd40> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14860> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e145c0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14890> # extension module '_sha2' loaded from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' # extension module '_sha2' executed from '/usr/lib64/python3.12/lib-dynload/_sha2.cpython-312-x86_64-linux-gnu.so' import '_sha2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9e14a70> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9de9ee0> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e16180> 
import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e14e00> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13aa00ec60> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e3e510> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e5a690> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e8f410> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9eb9bb0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e8f530> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e5b320> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c944a0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e596d0> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9e170b0> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f13a9e597f0> # zipimport: found 30 names in '/tmp/ansible_stat_payload_3qo0iess/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # 
/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cee0c0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc5040> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc41a0> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9cc7fe0> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d19b80> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19910> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19220> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d19c70> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9ceeb70> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d1a930> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9d1ab70> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9d1b0b0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from 
'/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b7ce90> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9b7eab0> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b7f3b0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b80590> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b83080> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9b831d0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b81340> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b86ff0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85ac0> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85820> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b87ef0> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b81850> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bcf110> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bcf290> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bd0e60> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd0c20> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bd33b0> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd1550> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdaba0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bd3530> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdbe60> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdba10> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 
'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdbf20> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bcf590> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdf680> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9be0650> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdddf0> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9bdf170> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9bdd9d0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9c68830> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c69580> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b85a30> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available 
# zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c69340> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c6bf50> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a75f10> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a76840> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9c6b290> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f13a9a756d0> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a76a80> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # 
/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9b06d50> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a80b30> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a7eb70> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f13a9a7e9c0> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy 
reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _sha2 # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # 
cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy 
ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy _hashlib # destroy _blake2 # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _sha2 # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # 
destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _operator # destroy _sha2 # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
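The JSON emitted by the module above ("changed": false, "stat": {"exists": false} with path /run/ostree-booted) is the output of the test's "Check if system is ostree" step, which the log identifies by name and task id further down. For reference, a tasks-file entry producing that invocation would look roughly like the sketch below; this is a reconstruction from the module arguments and variable names visible in this log, not the verified contents of el_repo_setup.yml (the register name __ostree_booted_stat is taken from the variable references later in the log, and the remaining stat arguments in the invocation are module defaults).

    # Sketch of the stat check inferred from this log (not the actual el_repo_setup.yml)
    - name: Check if system is ostree
      ansible.builtin.stat:
        path: /run/ostree-booted
      register: __ostree_booted_stat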
[WARNING]: Module invocation had junk after the JSON data: [same Python interpreter shutdown trace as in the module stdout above, from "# destroy __main__" through "# clear sys.audit hooks"] 22736 1727204239.53559: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204239.53562: _low_level_execute_command(): starting 22736 1727204239.53565:
_low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204238.6487117-22914-55779580031446/ > /dev/null 2>&1 && sleep 0' 22736 1727204239.53884: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204239.53971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204239.54028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204239.54045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204239.54078: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204239.54146: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204239.57019: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204239.57066: stderr chunk (state=3): >>><<< 22736 1727204239.57076: stdout chunk (state=3): >>><<< 22736 1727204239.57103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204239.57138: handler run complete 22736 1727204239.57167: attempt loop complete, returning result 22736 1727204239.57294: _execute() done 22736 1727204239.57298: dumping result to json 22736 1727204239.57300: done dumping result, returning 22736 1727204239.57302: done running TaskExecutor() for managed-node2/TASK: Check if system is ostree [12b410aa-8751-4f4a-548a-00000000008f] 22736 1727204239.57304: sending task result for task 12b410aa-8751-4f4a-548a-00000000008f 22736 1727204239.57381: done sending task result for task 
12b410aa-8751-4f4a-548a-00000000008f 22736 1727204239.57384: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22736 1727204239.57468: no more pending results, returning what we have 22736 1727204239.57471: results queue empty 22736 1727204239.57472: checking for any_errors_fatal 22736 1727204239.57482: done checking for any_errors_fatal 22736 1727204239.57483: checking for max_fail_percentage 22736 1727204239.57485: done checking for max_fail_percentage 22736 1727204239.57486: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.57487: done checking to see if all hosts have failed 22736 1727204239.57488: getting the remaining hosts for this loop 22736 1727204239.57696: done getting the remaining hosts for this loop 22736 1727204239.57702: getting the next task for host managed-node2 22736 1727204239.57710: done getting next task for host managed-node2 22736 1727204239.57716: ^ task is: TASK: Set flag to indicate system is ostree 22736 1727204239.57719: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.57723: getting variables 22736 1727204239.57725: in VariableManager get_vars() 22736 1727204239.57761: Calling all_inventory to load vars for managed-node2 22736 1727204239.57765: Calling groups_inventory to load vars for managed-node2 22736 1727204239.57769: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.57783: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.57787: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.57800: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.58253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.58597: done with get_vars() 22736 1727204239.58610: done getting variables 22736 1727204239.58733: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Tuesday 24 September 2024 14:57:19 -0400 (0:00:01.020) 0:00:04.372 ***** 22736 1727204239.58768: entering _queue_task() for managed-node2/set_fact 22736 1727204239.58770: Creating lock for set_fact 22736 1727204239.59247: worker is 1 (out of 1 available) 22736 1727204239.59260: exiting _queue_task() for managed-node2/set_fact 22736 1727204239.59273: done queuing things up, now waiting for results queue to drain 22736 1727204239.59274: waiting for pending results... 
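The set_fact task queued here ("Set flag to indicate system is ostree", el_repo_setup.yml:22) turns the registered stat result into the __network_is_ostree flag; its conditional and the resulting fact appear in the lines that follow. A plausible shape for the task, again a sketch inferred from the log rather than the verified file contents:

    # Sketch inferred from the conditional and the fact shown in this log
    - name: Set flag to indicate system is ostree
      ansible.builtin.set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined

With /run/ostree-booted absent, stat.exists is false, which matches the __network_is_ostree: false fact reported below.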
22736 1727204239.59570: running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree 22736 1727204239.59575: in run() - task 12b410aa-8751-4f4a-548a-000000000090 22736 1727204239.59598: variable 'ansible_search_path' from source: unknown 22736 1727204239.59616: variable 'ansible_search_path' from source: unknown 22736 1727204239.59660: calling self._execute() 22736 1727204239.59757: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.59776: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.59796: variable 'omit' from source: magic vars 22736 1727204239.60412: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204239.60746: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204239.60830: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204239.60888: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204239.60983: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204239.61128: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204239.61180: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204239.61232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204239.61308: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204239.61464: Evaluated conditional (not __network_is_ostree is defined): True 22736 1727204239.61482: variable 'omit' from source: magic vars 22736 1727204239.61550: variable 'omit' from source: magic vars 22736 1727204239.61793: variable '__ostree_booted_stat' from source: set_fact 22736 1727204239.61821: variable 'omit' from source: magic vars 22736 1727204239.61864: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204239.61917: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204239.61946: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204239.61981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204239.62007: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204239.62074: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204239.62077: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.62080: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.62210: Set connection var ansible_timeout to 10 22736 1727204239.62243: 
Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204239.62292: Set connection var ansible_shell_executable to /bin/sh 22736 1727204239.62297: Set connection var ansible_shell_type to sh 22736 1727204239.62300: Set connection var ansible_pipelining to False 22736 1727204239.62303: Set connection var ansible_connection to ssh 22736 1727204239.62322: variable 'ansible_shell_executable' from source: unknown 22736 1727204239.62336: variable 'ansible_connection' from source: unknown 22736 1727204239.62349: variable 'ansible_module_compression' from source: unknown 22736 1727204239.62396: variable 'ansible_shell_type' from source: unknown 22736 1727204239.62399: variable 'ansible_shell_executable' from source: unknown 22736 1727204239.62401: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.62403: variable 'ansible_pipelining' from source: unknown 22736 1727204239.62405: variable 'ansible_timeout' from source: unknown 22736 1727204239.62408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.62534: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204239.62561: variable 'omit' from source: magic vars 22736 1727204239.62571: starting attempt loop 22736 1727204239.62578: running the handler 22736 1727204239.62617: handler run complete 22736 1727204239.62620: attempt loop complete, returning result 22736 1727204239.62624: _execute() done 22736 1727204239.62658: dumping result to json 22736 1727204239.62660: done dumping result, returning 22736 1727204239.62663: done running TaskExecutor() for managed-node2/TASK: Set flag to indicate system is ostree [12b410aa-8751-4f4a-548a-000000000090] 22736 1727204239.62670: sending task result for task 12b410aa-8751-4f4a-548a-000000000090 22736 1727204239.62943: done sending task result for task 12b410aa-8751-4f4a-548a-000000000090 22736 1727204239.62947: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "__network_is_ostree": false }, "changed": false } 22736 1727204239.63017: no more pending results, returning what we have 22736 1727204239.63021: results queue empty 22736 1727204239.63022: checking for any_errors_fatal 22736 1727204239.63031: done checking for any_errors_fatal 22736 1727204239.63032: checking for max_fail_percentage 22736 1727204239.63034: done checking for max_fail_percentage 22736 1727204239.63035: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.63036: done checking to see if all hosts have failed 22736 1727204239.63037: getting the remaining hosts for this loop 22736 1727204239.63038: done getting the remaining hosts for this loop 22736 1727204239.63043: getting the next task for host managed-node2 22736 1727204239.63053: done getting next task for host managed-node2 22736 1727204239.63063: ^ task is: TASK: Fix CentOS6 Base repo 22736 1727204239.63066: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.63071: getting variables 22736 1727204239.63073: in VariableManager get_vars() 22736 1727204239.63185: Calling all_inventory to load vars for managed-node2 22736 1727204239.63188: Calling groups_inventory to load vars for managed-node2 22736 1727204239.63196: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.63210: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.63215: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.63226: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.63519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.63828: done with get_vars() 22736 1727204239.63840: done getting variables 22736 1727204239.63984: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Fix CentOS6 Base repo] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.052) 0:00:04.425 ***** 22736 1727204239.64026: entering _queue_task() for managed-node2/copy 22736 1727204239.64345: worker is 1 (out of 1 available) 22736 1727204239.64358: exiting _queue_task() for managed-node2/copy 22736 1727204239.64371: done queuing things up, now waiting for results queue to drain 22736 1727204239.64372: waiting for pending results... 
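The "ok" result above comes from a set_fact handler that runs entirely on the controller: it records whether the earlier stat (captured in __ostree_booted_stat) found the ostree marker file, and the conditional "not __network_is_ostree is defined" ensures the flag is only computed once. A plausible sketch of such a task, assuming the usual stat-result shape (hypothetical; the real el_repo_setup.yml may differ):

    - name: Set flag to indicate system is ostree
      set_fact:
        __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
      when: not __network_is_ostree is defined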
22736 1727204239.64669: running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo 22736 1727204239.64729: in run() - task 12b410aa-8751-4f4a-548a-000000000092 22736 1727204239.64751: variable 'ansible_search_path' from source: unknown 22736 1727204239.64768: variable 'ansible_search_path' from source: unknown 22736 1727204239.64811: calling self._execute() 22736 1727204239.64987: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.64993: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.64996: variable 'omit' from source: magic vars 22736 1727204239.65588: variable 'ansible_distribution' from source: facts 22736 1727204239.65627: Evaluated conditional (ansible_distribution == 'CentOS'): False 22736 1727204239.65647: when evaluation is False, skipping this task 22736 1727204239.65659: _execute() done 22736 1727204239.65668: dumping result to json 22736 1727204239.65676: done dumping result, returning 22736 1727204239.65687: done running TaskExecutor() for managed-node2/TASK: Fix CentOS6 Base repo [12b410aa-8751-4f4a-548a-000000000092] 22736 1727204239.65708: sending task result for task 12b410aa-8751-4f4a-548a-000000000092 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution == 'CentOS'", "skip_reason": "Conditional result was False" } 22736 1727204239.65970: no more pending results, returning what we have 22736 1727204239.65974: results queue empty 22736 1727204239.65975: checking for any_errors_fatal 22736 1727204239.65981: done checking for any_errors_fatal 22736 1727204239.65982: checking for max_fail_percentage 22736 1727204239.65984: done checking for max_fail_percentage 22736 1727204239.65985: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.65986: done checking to see if all hosts have failed 22736 1727204239.65987: getting the remaining hosts for this loop 22736 1727204239.65991: done getting the remaining hosts for this loop 22736 1727204239.66194: getting the next task for host managed-node2 22736 1727204239.66202: done getting next task for host managed-node2 22736 1727204239.66205: ^ task is: TASK: Include the task 'enable_epel.yml' 22736 1727204239.66209: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.66215: getting variables 22736 1727204239.66217: in VariableManager get_vars() 22736 1727204239.66244: Calling all_inventory to load vars for managed-node2 22736 1727204239.66247: Calling groups_inventory to load vars for managed-node2 22736 1727204239.66251: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.66262: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.66266: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.66270: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.66547: done sending task result for task 12b410aa-8751-4f4a-548a-000000000092 22736 1727204239.66551: WORKER PROCESS EXITING 22736 1727204239.66579: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.66881: done with get_vars() 22736 1727204239.66894: done getting variables TASK [Include the task 'enable_epel.yml'] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.029) 0:00:04.455 ***** 22736 1727204239.67014: entering _queue_task() for managed-node2/include_tasks 22736 1727204239.67403: worker is 1 (out of 1 available) 22736 1727204239.67418: exiting _queue_task() for managed-node2/include_tasks 22736 1727204239.67429: done queuing things up, now waiting for results queue to drain 22736 1727204239.67430: waiting for pending results... 22736 1727204239.67598: running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' 22736 1727204239.67738: in run() - task 12b410aa-8751-4f4a-548a-000000000093 22736 1727204239.67760: variable 'ansible_search_path' from source: unknown 22736 1727204239.67771: variable 'ansible_search_path' from source: unknown 22736 1727204239.67819: calling self._execute() 22736 1727204239.67917: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.67936: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.67959: variable 'omit' from source: magic vars 22736 1727204239.68620: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204239.71525: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204239.71565: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204239.71618: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204239.71688: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204239.71730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204239.71851: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204239.71905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204239.71948: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204239.72022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204239.72069: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204239.72211: variable '__network_is_ostree' from source: set_fact 22736 1727204239.72239: Evaluated conditional (not __network_is_ostree | d(false)): True 22736 1727204239.72288: _execute() done 22736 1727204239.72291: dumping result to json 22736 1727204239.72295: done dumping result, returning 22736 1727204239.72298: done running TaskExecutor() for managed-node2/TASK: Include the task 'enable_epel.yml' [12b410aa-8751-4f4a-548a-000000000093] 22736 1727204239.72301: sending task result for task 12b410aa-8751-4f4a-548a-000000000093 22736 1727204239.72472: done sending task result for task 12b410aa-8751-4f4a-548a-000000000093 22736 1727204239.72476: WORKER PROCESS EXITING 22736 1727204239.72536: no more pending results, returning what we have 22736 1727204239.72542: in VariableManager get_vars() 22736 1727204239.72580: Calling all_inventory to load vars for managed-node2 22736 1727204239.72584: Calling groups_inventory to load vars for managed-node2 22736 1727204239.72591: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.72606: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.72611: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.72618: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.73072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.73374: done with get_vars() 22736 1727204239.73385: variable 'ansible_search_path' from source: unknown 22736 1727204239.73387: variable 'ansible_search_path' from source: unknown 22736 1727204239.73436: we have included files to process 22736 1727204239.73438: generating all_blocks data 22736 1727204239.73440: done generating all_blocks data 22736 1727204239.73453: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22736 1727204239.73455: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22736 1727204239.73459: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml 22736 1727204239.74452: done processing included file 22736 1727204239.74455: iterating over new_blocks loaded from include file 22736 1727204239.74457: in VariableManager get_vars() 22736 1727204239.74471: done with get_vars() 22736 1727204239.74472: filtering new block on tags 22736 1727204239.74505: done filtering new block on tags 22736 1727204239.74509: in VariableManager get_vars() 22736 1727204239.74525: done with get_vars() 22736 1727204239.74527: filtering new block on tags 22736 1727204239.74550: done filtering new block on tags 22736 1727204239.74553: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed-node2 22736 1727204239.74559: extending task lists for all hosts with included blocks 22736 1727204239.74709: done extending task lists 22736 1727204239.74711: done processing included files 22736 1727204239.74714: results queue empty 22736 1727204239.74715: checking for any_errors_fatal 22736 1727204239.74719: done checking for any_errors_fatal 22736 1727204239.74720: checking for max_fail_percentage 22736 1727204239.74721: done checking for max_fail_percentage 22736 1727204239.74722: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.74722: done checking to see if all hosts have failed 22736 1727204239.74723: getting the remaining hosts for this loop 22736 1727204239.74724: done getting the remaining hosts for this loop 22736 1727204239.74727: getting the next task for host managed-node2 22736 1727204239.74732: done getting next task for host managed-node2 22736 1727204239.74734: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }} 22736 1727204239.74737: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.74739: getting variables 22736 1727204239.74740: in VariableManager get_vars() 22736 1727204239.74749: Calling all_inventory to load vars for managed-node2 22736 1727204239.74751: Calling groups_inventory to load vars for managed-node2 22736 1727204239.74754: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.74768: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.74776: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.74780: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.75009: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.75305: done with get_vars() 22736 1727204239.75321: done getting variables 22736 1727204239.75400: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 22736 1727204239.75665: variable 'ansible_distribution_major_version' from source: facts TASK [Create EPEL 39] ********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.087) 0:00:04.542 ***** 22736 1727204239.75723: entering _queue_task() for managed-node2/command 22736 1727204239.75726: Creating lock for command 22736 1727204239.76112: worker is 1 (out of 1 available) 22736 1727204239.76128: exiting _queue_task() for managed-node2/command 22736 1727204239.76141: done queuing things up, now waiting for results queue to drain 22736 1727204239.76142: waiting for pending results... 
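Two things happened in the block above: the "Fix CentOS6 Base repo" copy task was skipped because ansible_distribution is not CentOS, and the include of enable_epel.yml went ahead because __network_is_ostree is false (d() is the short alias of the default filter). A plausible sketch of the include task as the log describes it (hypothetical; the exact wording in el_repo_setup.yml may differ):

    - name: Include the task 'enable_epel.yml'
      include_tasks: enable_epel.yml
      when: not __network_is_ostree | d(false)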
22736 1727204239.76508: running TaskExecutor() for managed-node2/TASK: Create EPEL 39 22736 1727204239.76546: in run() - task 12b410aa-8751-4f4a-548a-0000000000ad 22736 1727204239.76595: variable 'ansible_search_path' from source: unknown 22736 1727204239.76598: variable 'ansible_search_path' from source: unknown 22736 1727204239.76623: calling self._execute() 22736 1727204239.76715: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.76842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.76847: variable 'omit' from source: magic vars 22736 1727204239.77234: variable 'ansible_distribution' from source: facts 22736 1727204239.77252: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22736 1727204239.77260: when evaluation is False, skipping this task 22736 1727204239.77267: _execute() done 22736 1727204239.77286: dumping result to json 22736 1727204239.77297: done dumping result, returning 22736 1727204239.77308: done running TaskExecutor() for managed-node2/TASK: Create EPEL 39 [12b410aa-8751-4f4a-548a-0000000000ad] 22736 1727204239.77320: sending task result for task 12b410aa-8751-4f4a-548a-0000000000ad skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22736 1727204239.77627: no more pending results, returning what we have 22736 1727204239.77631: results queue empty 22736 1727204239.77632: checking for any_errors_fatal 22736 1727204239.77634: done checking for any_errors_fatal 22736 1727204239.77635: checking for max_fail_percentage 22736 1727204239.77637: done checking for max_fail_percentage 22736 1727204239.77638: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.77639: done checking to see if all hosts have failed 22736 1727204239.77640: getting the remaining hosts for this loop 22736 1727204239.77641: done getting the remaining hosts for this loop 22736 1727204239.77646: getting the next task for host managed-node2 22736 1727204239.77654: done getting next task for host managed-node2 22736 1727204239.77657: ^ task is: TASK: Install yum-utils package 22736 1727204239.77662: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.77781: getting variables 22736 1727204239.77784: in VariableManager get_vars() 22736 1727204239.77814: Calling all_inventory to load vars for managed-node2 22736 1727204239.77818: Calling groups_inventory to load vars for managed-node2 22736 1727204239.77821: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.77832: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.77836: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.77840: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.78099: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000ad 22736 1727204239.78104: WORKER PROCESS EXITING 22736 1727204239.78139: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.78476: done with get_vars() 22736 1727204239.78487: done getting variables 22736 1727204239.78607: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Install yum-utils package] *********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.029) 0:00:04.571 ***** 22736 1727204239.78643: entering _queue_task() for managed-node2/package 22736 1727204239.78645: Creating lock for package 22736 1727204239.78938: worker is 1 (out of 1 available) 22736 1727204239.78951: exiting _queue_task() for managed-node2/package 22736 1727204239.78963: done queuing things up, now waiting for results queue to drain 22736 1727204239.78965: waiting for pending results... 
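The skip result above also shows how conditionals are reported: the node is evidently a Fedora 39 host (the task name templated to "Create EPEL 39"), so "ansible_distribution in ['RedHat', 'CentOS']" evaluates to False and is echoed back as false_condition. With a list-style when, the conditions are evaluated in order and the first one that comes up False is the one reported, roughly like this hypothetical shape (placeholder body, not the real enable_epel.yml task):

    - name: Create EPEL {{ ansible_distribution_major_version }}
      command: /bin/true        # placeholder body
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version | int >= 7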
22736 1727204239.79247: running TaskExecutor() for managed-node2/TASK: Install yum-utils package 22736 1727204239.79392: in run() - task 12b410aa-8751-4f4a-548a-0000000000ae 22736 1727204239.79432: variable 'ansible_search_path' from source: unknown 22736 1727204239.79436: variable 'ansible_search_path' from source: unknown 22736 1727204239.79542: calling self._execute() 22736 1727204239.79574: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.79588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.79607: variable 'omit' from source: magic vars 22736 1727204239.80070: variable 'ansible_distribution' from source: facts 22736 1727204239.80106: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22736 1727204239.80118: when evaluation is False, skipping this task 22736 1727204239.80127: _execute() done 22736 1727204239.80196: dumping result to json 22736 1727204239.80203: done dumping result, returning 22736 1727204239.80207: done running TaskExecutor() for managed-node2/TASK: Install yum-utils package [12b410aa-8751-4f4a-548a-0000000000ae] 22736 1727204239.80209: sending task result for task 12b410aa-8751-4f4a-548a-0000000000ae skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22736 1727204239.80351: no more pending results, returning what we have 22736 1727204239.80355: results queue empty 22736 1727204239.80356: checking for any_errors_fatal 22736 1727204239.80366: done checking for any_errors_fatal 22736 1727204239.80367: checking for max_fail_percentage 22736 1727204239.80369: done checking for max_fail_percentage 22736 1727204239.80370: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.80371: done checking to see if all hosts have failed 22736 1727204239.80372: getting the remaining hosts for this loop 22736 1727204239.80374: done getting the remaining hosts for this loop 22736 1727204239.80379: getting the next task for host managed-node2 22736 1727204239.80388: done getting next task for host managed-node2 22736 1727204239.80501: ^ task is: TASK: Enable EPEL 7 22736 1727204239.80508: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.80514: getting variables 22736 1727204239.80517: in VariableManager get_vars() 22736 1727204239.80551: Calling all_inventory to load vars for managed-node2 22736 1727204239.80554: Calling groups_inventory to load vars for managed-node2 22736 1727204239.80559: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.80575: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.80580: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.80585: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.80711: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000ae 22736 1727204239.80717: WORKER PROCESS EXITING 22736 1727204239.81017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.81322: done with get_vars() 22736 1727204239.81333: done getting variables 22736 1727204239.81416: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 7] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.028) 0:00:04.599 ***** 22736 1727204239.81451: entering _queue_task() for managed-node2/command 22736 1727204239.81755: worker is 1 (out of 1 available) 22736 1727204239.81768: exiting _queue_task() for managed-node2/command 22736 1727204239.81781: done queuing things up, now waiting for results queue to drain 22736 1727204239.81783: waiting for pending results... 
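"Install yum-utils package" follows the same pattern with the generic package action plugin; a minimal sketch of what the task presumably looks like (the state and any extra options are assumptions):

    - name: Install yum-utils package
      package:
        name: yum-utils
        state: present
      when: ansible_distribution in ['RedHat', 'CentOS']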
22736 1727204239.82078: running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 22736 1727204239.82236: in run() - task 12b410aa-8751-4f4a-548a-0000000000af 22736 1727204239.82266: variable 'ansible_search_path' from source: unknown 22736 1727204239.82331: variable 'ansible_search_path' from source: unknown 22736 1727204239.82337: calling self._execute() 22736 1727204239.82444: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.82465: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.82482: variable 'omit' from source: magic vars 22736 1727204239.82965: variable 'ansible_distribution' from source: facts 22736 1727204239.82995: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22736 1727204239.83009: when evaluation is False, skipping this task 22736 1727204239.83094: _execute() done 22736 1727204239.83100: dumping result to json 22736 1727204239.83102: done dumping result, returning 22736 1727204239.83105: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 7 [12b410aa-8751-4f4a-548a-0000000000af] 22736 1727204239.83108: sending task result for task 12b410aa-8751-4f4a-548a-0000000000af 22736 1727204239.83183: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000af 22736 1727204239.83186: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22736 1727204239.83259: no more pending results, returning what we have 22736 1727204239.83263: results queue empty 22736 1727204239.83264: checking for any_errors_fatal 22736 1727204239.83271: done checking for any_errors_fatal 22736 1727204239.83272: checking for max_fail_percentage 22736 1727204239.83274: done checking for max_fail_percentage 22736 1727204239.83275: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.83276: done checking to see if all hosts have failed 22736 1727204239.83277: getting the remaining hosts for this loop 22736 1727204239.83278: done getting the remaining hosts for this loop 22736 1727204239.83283: getting the next task for host managed-node2 22736 1727204239.83294: done getting next task for host managed-node2 22736 1727204239.83297: ^ task is: TASK: Enable EPEL 8 22736 1727204239.83304: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.83308: getting variables 22736 1727204239.83310: in VariableManager get_vars() 22736 1727204239.83345: Calling all_inventory to load vars for managed-node2 22736 1727204239.83348: Calling groups_inventory to load vars for managed-node2 22736 1727204239.83353: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.83369: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.83374: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.83379: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.83874: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.84180: done with get_vars() 22736 1727204239.84193: done getting variables 22736 1727204239.84260: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.028) 0:00:04.627 ***** 22736 1727204239.84305: entering _queue_task() for managed-node2/command 22736 1727204239.84561: worker is 1 (out of 1 available) 22736 1727204239.84573: exiting _queue_task() for managed-node2/command 22736 1727204239.84586: done queuing things up, now waiting for results queue to drain 22736 1727204239.84587: waiting for pending results... 22736 1727204239.84963: running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 22736 1727204239.85155: in run() - task 12b410aa-8751-4f4a-548a-0000000000b0 22736 1727204239.85160: variable 'ansible_search_path' from source: unknown 22736 1727204239.85163: variable 'ansible_search_path' from source: unknown 22736 1727204239.85166: calling self._execute() 22736 1727204239.85237: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.85258: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.85276: variable 'omit' from source: magic vars 22736 1727204239.85742: variable 'ansible_distribution' from source: facts 22736 1727204239.85764: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22736 1727204239.85776: when evaluation is False, skipping this task 22736 1727204239.85786: _execute() done 22736 1727204239.85807: dumping result to json 22736 1727204239.85825: done dumping result, returning 22736 1727204239.85896: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 8 [12b410aa-8751-4f4a-548a-0000000000b0] 22736 1727204239.85900: sending task result for task 12b410aa-8751-4f4a-548a-0000000000b0 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22736 1727204239.86077: no more pending results, returning what we have 22736 1727204239.86081: results queue empty 22736 1727204239.86082: checking for any_errors_fatal 22736 1727204239.86088: done checking for any_errors_fatal 22736 1727204239.86091: checking for max_fail_percentage 22736 1727204239.86093: done checking for max_fail_percentage 22736 1727204239.86094: checking to see if all hosts have 
failed and the running result is not ok 22736 1727204239.86095: done checking to see if all hosts have failed 22736 1727204239.86096: getting the remaining hosts for this loop 22736 1727204239.86098: done getting the remaining hosts for this loop 22736 1727204239.86102: getting the next task for host managed-node2 22736 1727204239.86115: done getting next task for host managed-node2 22736 1727204239.86118: ^ task is: TASK: Enable EPEL 6 22736 1727204239.86123: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.86129: getting variables 22736 1727204239.86261: in VariableManager get_vars() 22736 1727204239.86287: Calling all_inventory to load vars for managed-node2 22736 1727204239.86292: Calling groups_inventory to load vars for managed-node2 22736 1727204239.86296: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.86302: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000b0 22736 1727204239.86305: WORKER PROCESS EXITING 22736 1727204239.86317: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.86321: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.86325: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.86561: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.86874: done with get_vars() 22736 1727204239.86885: done getting variables 22736 1727204239.86970: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.027) 0:00:04.655 ***** 22736 1727204239.87055: entering _queue_task() for managed-node2/copy 22736 1727204239.87559: worker is 1 (out of 1 available) 22736 1727204239.87568: exiting _queue_task() for managed-node2/copy 22736 1727204239.87577: done queuing things up, now waiting for results queue to drain 22736 1727204239.87579: waiting for pending results... 
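Enable EPEL 7 and Enable EPEL 8 are both command tasks guarded by the same distribution check and are skipped here for the same reason; Enable EPEL 6, queued next, uses the copy action instead, presumably dropping a repo file directly on the much older platform. A hypothetical shape for the version-specific command variants (the actual command and any additional version condition in enable_epel.yml are assumptions):

    - name: Enable EPEL 7
      command: yum-config-manager --enable epel   # assumed command
      when:
        - ansible_distribution in ['RedHat', 'CentOS']
        - ansible_distribution_major_version == '7'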
22736 1727204239.87787: running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 22736 1727204239.88095: in run() - task 12b410aa-8751-4f4a-548a-0000000000b2 22736 1727204239.88294: variable 'ansible_search_path' from source: unknown 22736 1727204239.88298: variable 'ansible_search_path' from source: unknown 22736 1727204239.88301: calling self._execute() 22736 1727204239.88304: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.88307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.88310: variable 'omit' from source: magic vars 22736 1727204239.88924: variable 'ansible_distribution' from source: facts 22736 1727204239.88942: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): False 22736 1727204239.88951: when evaluation is False, skipping this task 22736 1727204239.88959: _execute() done 22736 1727204239.88978: dumping result to json 22736 1727204239.88991: done dumping result, returning 22736 1727204239.89003: done running TaskExecutor() for managed-node2/TASK: Enable EPEL 6 [12b410aa-8751-4f4a-548a-0000000000b2] 22736 1727204239.89013: sending task result for task 12b410aa-8751-4f4a-548a-0000000000b2 skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in ['RedHat', 'CentOS']", "skip_reason": "Conditional result was False" } 22736 1727204239.89165: no more pending results, returning what we have 22736 1727204239.89170: results queue empty 22736 1727204239.89171: checking for any_errors_fatal 22736 1727204239.89176: done checking for any_errors_fatal 22736 1727204239.89177: checking for max_fail_percentage 22736 1727204239.89179: done checking for max_fail_percentage 22736 1727204239.89180: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.89181: done checking to see if all hosts have failed 22736 1727204239.89182: getting the remaining hosts for this loop 22736 1727204239.89184: done getting the remaining hosts for this loop 22736 1727204239.89188: getting the next task for host managed-node2 22736 1727204239.89201: done getting next task for host managed-node2 22736 1727204239.89205: ^ task is: TASK: Set network provider to 'nm' 22736 1727204239.89208: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.89213: getting variables 22736 1727204239.89215: in VariableManager get_vars() 22736 1727204239.89246: Calling all_inventory to load vars for managed-node2 22736 1727204239.89249: Calling groups_inventory to load vars for managed-node2 22736 1727204239.89253: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.89269: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.89274: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.89278: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.89722: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000b2 22736 1727204239.89725: WORKER PROCESS EXITING 22736 1727204239.89751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.90026: done with get_vars() 22736 1727204239.90036: done getting variables 22736 1727204239.90099: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:13 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.030) 0:00:04.686 ***** 22736 1727204239.90128: entering _queue_task() for managed-node2/set_fact 22736 1727204239.90348: worker is 1 (out of 1 available) 22736 1727204239.90361: exiting _queue_task() for managed-node2/set_fact 22736 1727204239.90373: done queuing things up, now waiting for results queue to drain 22736 1727204239.90375: waiting for pending results... 
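With the EPEL block finished, control returns to the top-level test playbook: tests_ethernet_nm.yml is the thin provider-specific wrapper that pins network_provider to "nm" before the shared ethernet test runs. A plausible sketch of that wrapper (hypothetical; only the set_fact value is confirmed by the result below, and the hosts pattern and import are assumptions):

    # tests_ethernet_nm.yml (sketch)
    - hosts: all
      tasks:
        - name: Set network provider to 'nm'
          set_fact:
            network_provider: nm

    - import_playbook: playbooks/tests_ethernet.yml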
22736 1727204239.90616: running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' 22736 1727204239.90713: in run() - task 12b410aa-8751-4f4a-548a-000000000007 22736 1727204239.90735: variable 'ansible_search_path' from source: unknown 22736 1727204239.90810: calling self._execute() 22736 1727204239.90898: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.90911: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.90932: variable 'omit' from source: magic vars 22736 1727204239.91054: variable 'omit' from source: magic vars 22736 1727204239.91096: variable 'omit' from source: magic vars 22736 1727204239.91194: variable 'omit' from source: magic vars 22736 1727204239.91198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204239.91244: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204239.91275: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204239.91303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204239.91323: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204239.91395: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204239.91399: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.91401: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.91514: Set connection var ansible_timeout to 10 22736 1727204239.91534: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204239.91551: Set connection var ansible_shell_executable to /bin/sh 22736 1727204239.91558: Set connection var ansible_shell_type to sh 22736 1727204239.91579: Set connection var ansible_pipelining to False 22736 1727204239.91582: Set connection var ansible_connection to ssh 22736 1727204239.91794: variable 'ansible_shell_executable' from source: unknown 22736 1727204239.91798: variable 'ansible_connection' from source: unknown 22736 1727204239.91800: variable 'ansible_module_compression' from source: unknown 22736 1727204239.91802: variable 'ansible_shell_type' from source: unknown 22736 1727204239.91805: variable 'ansible_shell_executable' from source: unknown 22736 1727204239.91807: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.91809: variable 'ansible_pipelining' from source: unknown 22736 1727204239.91811: variable 'ansible_timeout' from source: unknown 22736 1727204239.91813: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.91824: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204239.91842: variable 'omit' from source: magic vars 22736 1727204239.91851: starting attempt loop 22736 1727204239.91860: running the handler 22736 1727204239.91879: handler run complete 22736 1727204239.91896: attempt loop complete, returning result 22736 1727204239.91904: _execute() done 22736 1727204239.91912: 
dumping result to json 22736 1727204239.91919: done dumping result, returning 22736 1727204239.91935: done running TaskExecutor() for managed-node2/TASK: Set network provider to 'nm' [12b410aa-8751-4f4a-548a-000000000007] 22736 1727204239.91946: sending task result for task 12b410aa-8751-4f4a-548a-000000000007 ok: [managed-node2] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 22736 1727204239.92150: no more pending results, returning what we have 22736 1727204239.92153: results queue empty 22736 1727204239.92154: checking for any_errors_fatal 22736 1727204239.92160: done checking for any_errors_fatal 22736 1727204239.92161: checking for max_fail_percentage 22736 1727204239.92163: done checking for max_fail_percentage 22736 1727204239.92164: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.92165: done checking to see if all hosts have failed 22736 1727204239.92166: getting the remaining hosts for this loop 22736 1727204239.92167: done getting the remaining hosts for this loop 22736 1727204239.92171: getting the next task for host managed-node2 22736 1727204239.92180: done getting next task for host managed-node2 22736 1727204239.92183: ^ task is: TASK: meta (flush_handlers) 22736 1727204239.92185: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.92193: getting variables 22736 1727204239.92195: in VariableManager get_vars() 22736 1727204239.92227: Calling all_inventory to load vars for managed-node2 22736 1727204239.92230: Calling groups_inventory to load vars for managed-node2 22736 1727204239.92234: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.92247: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.92251: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.92254: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.92596: done sending task result for task 12b410aa-8751-4f4a-548a-000000000007 22736 1727204239.92600: WORKER PROCESS EXITING 22736 1727204239.92628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.92886: done with get_vars() 22736 1727204239.92899: done getting variables 22736 1727204239.92972: in VariableManager get_vars() 22736 1727204239.92982: Calling all_inventory to load vars for managed-node2 22736 1727204239.92985: Calling groups_inventory to load vars for managed-node2 22736 1727204239.92988: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.92995: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.92998: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.93001: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.93409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.93641: done with get_vars() 22736 1727204239.93660: done queuing things up, now waiting for results queue to drain 22736 1727204239.93662: results queue empty 22736 1727204239.93663: checking for any_errors_fatal 22736 1727204239.93665: done checking for any_errors_fatal 22736 1727204239.93666: checking for 
max_fail_percentage 22736 1727204239.93667: done checking for max_fail_percentage 22736 1727204239.93668: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.93669: done checking to see if all hosts have failed 22736 1727204239.93670: getting the remaining hosts for this loop 22736 1727204239.93671: done getting the remaining hosts for this loop 22736 1727204239.93673: getting the next task for host managed-node2 22736 1727204239.93677: done getting next task for host managed-node2 22736 1727204239.93679: ^ task is: TASK: meta (flush_handlers) 22736 1727204239.93680: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.93687: getting variables 22736 1727204239.93688: in VariableManager get_vars() 22736 1727204239.93698: Calling all_inventory to load vars for managed-node2 22736 1727204239.93701: Calling groups_inventory to load vars for managed-node2 22736 1727204239.93703: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.93708: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.93710: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.93714: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.93888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.94337: done with get_vars() 22736 1727204239.94348: done getting variables 22736 1727204239.94408: in VariableManager get_vars() 22736 1727204239.94418: Calling all_inventory to load vars for managed-node2 22736 1727204239.94421: Calling groups_inventory to load vars for managed-node2 22736 1727204239.94424: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.94429: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.94432: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.94436: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.94851: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.95526: done with get_vars() 22736 1727204239.95540: done queuing things up, now waiting for results queue to drain 22736 1727204239.95542: results queue empty 22736 1727204239.95543: checking for any_errors_fatal 22736 1727204239.95546: done checking for any_errors_fatal 22736 1727204239.95547: checking for max_fail_percentage 22736 1727204239.95548: done checking for max_fail_percentage 22736 1727204239.95549: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.95550: done checking to see if all hosts have failed 22736 1727204239.95551: getting the remaining hosts for this loop 22736 1727204239.95552: done getting the remaining hosts for this loop 22736 1727204239.95555: getting the next task for host managed-node2 22736 1727204239.95559: done getting next task for host managed-node2 22736 1727204239.95560: ^ task is: None 22736 1727204239.95562: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.95563: done queuing things up, now waiting for results queue to drain 22736 1727204239.95564: results queue empty 22736 1727204239.95565: checking for any_errors_fatal 22736 1727204239.95566: done checking for any_errors_fatal 22736 1727204239.95567: checking for max_fail_percentage 22736 1727204239.95568: done checking for max_fail_percentage 22736 1727204239.95569: checking to see if all hosts have failed and the running result is not ok 22736 1727204239.95570: done checking to see if all hosts have failed 22736 1727204239.95572: getting the next task for host managed-node2 22736 1727204239.95575: done getting next task for host managed-node2 22736 1727204239.95576: ^ task is: None 22736 1727204239.95578: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204239.95810: in VariableManager get_vars() 22736 1727204239.95827: done with get_vars() 22736 1727204239.95834: in VariableManager get_vars() 22736 1727204239.95844: done with get_vars() 22736 1727204239.95851: variable 'omit' from source: magic vars 22736 1727204239.95888: in VariableManager get_vars() 22736 1727204239.96107: done with get_vars() 22736 1727204239.96133: variable 'omit' from source: magic vars PLAY [Play for showing the network provider] *********************************** 22736 1727204239.96405: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204239.96655: getting the remaining hosts for this loop 22736 1727204239.96657: done getting the remaining hosts for this loop 22736 1727204239.96660: getting the next task for host managed-node2 22736 1727204239.96663: done getting next task for host managed-node2 22736 1727204239.96666: ^ task is: TASK: Gathering Facts 22736 1727204239.96668: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204239.96670: getting variables 22736 1727204239.96671: in VariableManager get_vars() 22736 1727204239.96680: Calling all_inventory to load vars for managed-node2 22736 1727204239.96683: Calling groups_inventory to load vars for managed-node2 22736 1727204239.96686: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204239.96695: Calling all_plugins_play to load vars for managed-node2 22736 1727204239.96712: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204239.96716: Calling groups_plugins_play to load vars for managed-node2 22736 1727204239.97215: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204239.97703: done with get_vars() 22736 1727204239.97713: done getting variables 22736 1727204239.97762: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Tuesday 24 September 2024 14:57:19 -0400 (0:00:00.076) 0:00:04.762 ***** 22736 1727204239.97881: entering _queue_task() for managed-node2/gather_facts 22736 1727204239.98482: worker is 1 (out of 1 available) 22736 1727204239.98497: exiting _queue_task() for managed-node2/gather_facts 22736 1727204239.98512: done queuing things up, now waiting for results queue to drain 22736 1727204239.98513: waiting for pending results... 
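The "Play for showing the network provider" that starts above opens with implicit fact gathering (the gather_facts action resolves to the setup module) before any of its own tasks run; given the play name, it presumably just reports the fact set a moment ago, along the lines of this hypothetical task:

    - name: Show network_provider
      debug:
        var: network_provider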
22736 1727204239.98974: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204239.99308: in run() - task 12b410aa-8751-4f4a-548a-0000000000d8 22736 1727204239.99315: variable 'ansible_search_path' from source: unknown 22736 1727204239.99355: calling self._execute() 22736 1727204239.99571: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204239.99586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204239.99605: variable 'omit' from source: magic vars 22736 1727204240.00771: variable 'ansible_distribution_major_version' from source: facts 22736 1727204240.00995: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204240.00998: variable 'omit' from source: magic vars 22736 1727204240.01001: variable 'omit' from source: magic vars 22736 1727204240.01003: variable 'omit' from source: magic vars 22736 1727204240.01080: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204240.01191: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204240.01278: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204240.01308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204240.01378: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204240.01421: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204240.01477: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204240.01486: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204240.01797: Set connection var ansible_timeout to 10 22736 1727204240.01824: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204240.01839: Set connection var ansible_shell_executable to /bin/sh 22736 1727204240.01849: Set connection var ansible_shell_type to sh 22736 1727204240.01923: Set connection var ansible_pipelining to False 22736 1727204240.01926: Set connection var ansible_connection to ssh 22736 1727204240.01956: variable 'ansible_shell_executable' from source: unknown 22736 1727204240.01959: variable 'ansible_connection' from source: unknown 22736 1727204240.01962: variable 'ansible_module_compression' from source: unknown 22736 1727204240.01965: variable 'ansible_shell_type' from source: unknown 22736 1727204240.01968: variable 'ansible_shell_executable' from source: unknown 22736 1727204240.02095: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204240.02099: variable 'ansible_pipelining' from source: unknown 22736 1727204240.02102: variable 'ansible_timeout' from source: unknown 22736 1727204240.02104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204240.02286: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204240.02314: variable 'omit' from source: magic vars 22736 1727204240.02325: starting attempt loop 22736 1727204240.02333: running the 
handler 22736 1727204240.02363: variable 'ansible_facts' from source: unknown 22736 1727204240.02390: _low_level_execute_command(): starting 22736 1727204240.02405: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204240.03309: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204240.03457: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204240.03462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204240.03486: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204240.03510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204240.03691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 22736 1727204240.06223: stdout chunk (state=3): >>>/root <<< 22736 1727204240.06499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204240.06502: stdout chunk (state=3): >>><<< 22736 1727204240.06527: stderr chunk (state=3): >>><<< 22736 1727204240.06828: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 22736 1727204240.06833: _low_level_execute_command(): starting 22736 1727204240.06836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156 `" && echo ansible-tmp-1727204240.067174-22962-199185233731156="` echo 
/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156 `" ) && sleep 0' 22736 1727204240.07959: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204240.08007: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.08208: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204240.08224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204240.08407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204240.10727: stdout chunk (state=3): >>>ansible-tmp-1727204240.067174-22962-199185233731156=/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156 <<< 22736 1727204240.10956: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204240.10968: stdout chunk (state=3): >>><<< 22736 1727204240.10980: stderr chunk (state=3): >>><<< 22736 1727204240.11098: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204240.067174-22962-199185233731156=/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204240.11102: variable 'ansible_module_compression' from source: unknown 22736 1727204240.11396: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204240.11399: variable 'ansible_facts' from source: unknown 22736 1727204240.11595: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py 22736 1727204240.12425: Sending initial data 22736 1727204240.12507: Sent initial data (153 bytes) 22736 1727204240.14568: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204240.14587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.14754: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204240.14810: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204240.15220: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204240.17045: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204240.17103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204240.17188: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmptogad37o /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py <<< 22736 1727204240.17195: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmptogad37o" to remote "/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py" <<< 22736 1727204240.20496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204240.20500: stderr chunk (state=3): >>><<< 22736 1727204240.20502: stdout chunk (state=3): >>><<< 22736 1727204240.20505: done transferring module to remote 22736 1727204240.20507: _low_level_execute_command(): starting 22736 1727204240.20511: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/ /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py && sleep 0' 22736 1727204240.21509: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204240.21513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.21516: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204240.21519: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.21630: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204240.21708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204240.24233: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204240.24237: stdout chunk (state=3): >>><<< 22736 1727204240.24240: stderr chunk (state=3): >>><<< 22736 1727204240.24244: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204240.24250: _low_level_execute_command(): starting 22736 1727204240.24252: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/AnsiballZ_setup.py && sleep 0' 22736 1727204240.25302: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204240.25330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204240.25357: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204240.25378: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204240.25400: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204240.25415: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204240.25433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.25502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204240.25546: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204240.25564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204240.25765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204240.25826: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.09352: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, 
"ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 1.02001953125, "5m": 0.67578125, "15m": 0.40625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [<<< 22736 1727204241.09361: stdout chunk (state=3): >>>], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, 
"ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3459, "used": 258}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147087872, "block_size": 4096, "block_total": 64479564, "block_available": 61315207, "block_used": 3164357, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", 
"network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b"<<< 22736 1727204241.09378: stdout chunk (state=3): >>>, "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.090083Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721090083", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204241.11628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204241.11694: stderr chunk (state=3): >>><<< 22736 1727204241.11699: stdout chunk (state=3): >>><<< 22736 1727204241.11735: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fibre_channel_wwn": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 1.02001953125, "5m": 0.67578125, "15m": 0.40625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2829, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 888, "free": 2829}, "nocache": {"free": 3459, "used": 258}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 744, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 
264108294144, "size_available": 251147087872, "block_size": 4096, "block_total": 64479564, "block_available": 61315207, "block_used": 3164357, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_local": {}, "ansible_lsb": {}, "ansible_pkg_mgr": "dnf", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.090083Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721090083", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_service_mgr": "systemd", "ansible_fips": false, "ansible_iscsi_iqn": "", "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204241.11963: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204241.11987: _low_level_execute_command(): starting 22736 1727204241.12030: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204240.067174-22962-199185233731156/ > /dev/null 2>&1 && sleep 0' 22736 1727204241.12495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204241.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.12501: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204241.12504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.12560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.12563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.12614: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.14561: 
stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204241.14614: stderr chunk (state=3): >>><<< 22736 1727204241.14620: stdout chunk (state=3): >>><<< 22736 1727204241.14640: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204241.14648: handler run complete 22736 1727204241.14762: variable 'ansible_facts' from source: unknown 22736 1727204241.14846: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.15103: variable 'ansible_facts' from source: unknown 22736 1727204241.15175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.15281: attempt loop complete, returning result 22736 1727204241.15286: _execute() done 22736 1727204241.15289: dumping result to json 22736 1727204241.15314: done dumping result, returning 22736 1727204241.15327: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-0000000000d8] 22736 1727204241.15330: sending task result for task 12b410aa-8751-4f4a-548a-0000000000d8 22736 1727204241.15628: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000d8 22736 1727204241.15631: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204241.15871: no more pending results, returning what we have 22736 1727204241.15873: results queue empty 22736 1727204241.15874: checking for any_errors_fatal 22736 1727204241.15876: done checking for any_errors_fatal 22736 1727204241.15877: checking for max_fail_percentage 22736 1727204241.15878: done checking for max_fail_percentage 22736 1727204241.15879: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.15880: done checking to see if all hosts have failed 22736 1727204241.15881: getting the remaining hosts for this loop 22736 1727204241.15882: done getting the remaining hosts for this loop 22736 1727204241.15884: getting the next task for host managed-node2 22736 1727204241.15891: done getting next task for host managed-node2 22736 1727204241.15892: ^ task is: TASK: meta (flush_handlers) 22736 1727204241.15894: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22736 1727204241.15897: getting variables 22736 1727204241.15898: in VariableManager get_vars() 22736 1727204241.15916: Calling all_inventory to load vars for managed-node2 22736 1727204241.15918: Calling groups_inventory to load vars for managed-node2 22736 1727204241.15920: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.15929: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.15931: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.15933: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.16059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.16227: done with get_vars() 22736 1727204241.16235: done getting variables 22736 1727204241.16291: in VariableManager get_vars() 22736 1727204241.16299: Calling all_inventory to load vars for managed-node2 22736 1727204241.16300: Calling groups_inventory to load vars for managed-node2 22736 1727204241.16302: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.16306: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.16308: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.16312: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.16423: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.16567: done with get_vars() 22736 1727204241.16577: done queuing things up, now waiting for results queue to drain 22736 1727204241.16578: results queue empty 22736 1727204241.16579: checking for any_errors_fatal 22736 1727204241.16582: done checking for any_errors_fatal 22736 1727204241.16582: checking for max_fail_percentage 22736 1727204241.16583: done checking for max_fail_percentage 22736 1727204241.16583: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.16588: done checking to see if all hosts have failed 22736 1727204241.16590: getting the remaining hosts for this loop 22736 1727204241.16592: done getting the remaining hosts for this loop 22736 1727204241.16593: getting the next task for host managed-node2 22736 1727204241.16596: done getting next task for host managed-node2 22736 1727204241.16598: ^ task is: TASK: Show inside ethernet tests 22736 1727204241.16599: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204241.16600: getting variables 22736 1727204241.16601: in VariableManager get_vars() 22736 1727204241.16607: Calling all_inventory to load vars for managed-node2 22736 1727204241.16609: Calling groups_inventory to load vars for managed-node2 22736 1727204241.16611: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.16615: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.16617: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.16619: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.16728: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.16877: done with get_vars() 22736 1727204241.16883: done getting variables 22736 1727204241.16946: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Show inside ethernet tests] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:6 Tuesday 24 September 2024 14:57:21 -0400 (0:00:01.191) 0:00:05.954 ***** 22736 1727204241.16968: entering _queue_task() for managed-node2/debug 22736 1727204241.16970: Creating lock for debug 22736 1727204241.17197: worker is 1 (out of 1 available) 22736 1727204241.17210: exiting _queue_task() for managed-node2/debug 22736 1727204241.17222: done queuing things up, now waiting for results queue to drain 22736 1727204241.17225: waiting for pending results... 
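
The Gathering Facts result echoed earlier is a single JSON document that AnsiballZ_setup.py writes to stdout; the controller parses it and stores the "ansible_facts" mapping as host facts, which is why conditionals such as ansible_distribution_major_version != '6' can be evaluated throughout the rest of this run without contacting the host again. A minimal, hedged sketch of reading such a blob, using a trimmed copy of the values from the output above (the variable name module_stdout is mine, not from the log):

import json

# Trimmed stand-in for the module stdout shown above; the real blob carries the
# complete "ansible_facts" dictionary (distribution, interfaces, devices, mounts, ...).
module_stdout = """
{"ansible_facts": {"ansible_distribution": "Fedora",
                   "ansible_distribution_major_version": "39",
                   "ansible_default_ipv4": {"interface": "eth0",
                                            "address": "10.31.9.159"}},
 "invocation": {"module_args": {"gather_subset": ["all"]}}}
"""

facts = json.loads(module_stdout)["ansible_facts"]
print(facts["ansible_distribution"], facts["ansible_distribution_major_version"])  # Fedora 39
print(facts["ansible_default_ipv4"]["address"])                                    # 10.31.9.159

The same approach works on a saved transcript, since the full facts blob appears verbatim in the "_low_level_execute_command() done: rc=0, stdout=" summary above.
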
22736 1727204241.17383: running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests 22736 1727204241.17455: in run() - task 12b410aa-8751-4f4a-548a-00000000000b 22736 1727204241.17461: variable 'ansible_search_path' from source: unknown 22736 1727204241.17497: calling self._execute() 22736 1727204241.17563: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.17578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.17585: variable 'omit' from source: magic vars 22736 1727204241.17964: variable 'ansible_distribution_major_version' from source: facts 22736 1727204241.17975: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204241.17981: variable 'omit' from source: magic vars 22736 1727204241.18011: variable 'omit' from source: magic vars 22736 1727204241.18044: variable 'omit' from source: magic vars 22736 1727204241.18076: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204241.18110: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204241.18132: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204241.18148: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.18159: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.18186: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204241.18192: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.18194: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.18280: Set connection var ansible_timeout to 10 22736 1727204241.18292: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204241.18301: Set connection var ansible_shell_executable to /bin/sh 22736 1727204241.18304: Set connection var ansible_shell_type to sh 22736 1727204241.18310: Set connection var ansible_pipelining to False 22736 1727204241.18313: Set connection var ansible_connection to ssh 22736 1727204241.18340: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.18343: variable 'ansible_connection' from source: unknown 22736 1727204241.18347: variable 'ansible_module_compression' from source: unknown 22736 1727204241.18350: variable 'ansible_shell_type' from source: unknown 22736 1727204241.18355: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.18358: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.18363: variable 'ansible_pipelining' from source: unknown 22736 1727204241.18366: variable 'ansible_timeout' from source: unknown 22736 1727204241.18372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.18491: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204241.18500: variable 'omit' from source: magic vars 22736 1727204241.18505: starting attempt loop 22736 1727204241.18508: running the 
handler 22736 1727204241.18554: handler run complete 22736 1727204241.18574: attempt loop complete, returning result 22736 1727204241.18577: _execute() done 22736 1727204241.18580: dumping result to json 22736 1727204241.18585: done dumping result, returning 22736 1727204241.18595: done running TaskExecutor() for managed-node2/TASK: Show inside ethernet tests [12b410aa-8751-4f4a-548a-00000000000b] 22736 1727204241.18598: sending task result for task 12b410aa-8751-4f4a-548a-00000000000b 22736 1727204241.18691: done sending task result for task 12b410aa-8751-4f4a-548a-00000000000b 22736 1727204241.18694: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Inside ethernet tests 22736 1727204241.18748: no more pending results, returning what we have 22736 1727204241.18752: results queue empty 22736 1727204241.18753: checking for any_errors_fatal 22736 1727204241.18755: done checking for any_errors_fatal 22736 1727204241.18756: checking for max_fail_percentage 22736 1727204241.18758: done checking for max_fail_percentage 22736 1727204241.18759: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.18760: done checking to see if all hosts have failed 22736 1727204241.18761: getting the remaining hosts for this loop 22736 1727204241.18762: done getting the remaining hosts for this loop 22736 1727204241.18766: getting the next task for host managed-node2 22736 1727204241.18771: done getting next task for host managed-node2 22736 1727204241.18774: ^ task is: TASK: Show network_provider 22736 1727204241.18777: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204241.18779: getting variables 22736 1727204241.18781: in VariableManager get_vars() 22736 1727204241.18864: Calling all_inventory to load vars for managed-node2 22736 1727204241.18867: Calling groups_inventory to load vars for managed-node2 22736 1727204241.18875: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.18883: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.18885: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.18888: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.19011: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.19164: done with get_vars() 22736 1727204241.19171: done getting variables 22736 1727204241.19218: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:9 Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.022) 0:00:05.977 ***** 22736 1727204241.19239: entering _queue_task() for managed-node2/debug 22736 1727204241.19433: worker is 1 (out of 1 available) 22736 1727204241.19447: exiting _queue_task() for managed-node2/debug 22736 1727204241.19459: done queuing things up, now waiting for results queue to drain 22736 1727204241.19460: waiting for pending results... 22736 1727204241.19611: running TaskExecutor() for managed-node2/TASK: Show network_provider 22736 1727204241.19667: in run() - task 12b410aa-8751-4f4a-548a-00000000000c 22736 1727204241.19679: variable 'ansible_search_path' from source: unknown 22736 1727204241.19715: calling self._execute() 22736 1727204241.19781: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.19788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.19805: variable 'omit' from source: magic vars 22736 1727204241.20100: variable 'ansible_distribution_major_version' from source: facts 22736 1727204241.20110: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204241.20122: variable 'omit' from source: magic vars 22736 1727204241.20148: variable 'omit' from source: magic vars 22736 1727204241.20178: variable 'omit' from source: magic vars 22736 1727204241.20213: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204241.20249: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204241.20264: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204241.20280: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.20293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.20322: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204241.20325: variable 'ansible_host' from source: host vars for 
'managed-node2' 22736 1727204241.20330: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.20413: Set connection var ansible_timeout to 10 22736 1727204241.20426: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204241.20434: Set connection var ansible_shell_executable to /bin/sh 22736 1727204241.20438: Set connection var ansible_shell_type to sh 22736 1727204241.20444: Set connection var ansible_pipelining to False 22736 1727204241.20447: Set connection var ansible_connection to ssh 22736 1727204241.20470: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.20473: variable 'ansible_connection' from source: unknown 22736 1727204241.20476: variable 'ansible_module_compression' from source: unknown 22736 1727204241.20479: variable 'ansible_shell_type' from source: unknown 22736 1727204241.20482: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.20487: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.20494: variable 'ansible_pipelining' from source: unknown 22736 1727204241.20497: variable 'ansible_timeout' from source: unknown 22736 1727204241.20503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.20621: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204241.20630: variable 'omit' from source: magic vars 22736 1727204241.20634: starting attempt loop 22736 1727204241.20639: running the handler 22736 1727204241.20681: variable 'network_provider' from source: set_fact 22736 1727204241.20747: variable 'network_provider' from source: set_fact 22736 1727204241.20768: handler run complete 22736 1727204241.20786: attempt loop complete, returning result 22736 1727204241.20793: _execute() done 22736 1727204241.20800: dumping result to json 22736 1727204241.20803: done dumping result, returning 22736 1727204241.20811: done running TaskExecutor() for managed-node2/TASK: Show network_provider [12b410aa-8751-4f4a-548a-00000000000c] 22736 1727204241.20816: sending task result for task 12b410aa-8751-4f4a-548a-00000000000c 22736 1727204241.20901: done sending task result for task 12b410aa-8751-4f4a-548a-00000000000c 22736 1727204241.20905: WORKER PROCESS EXITING ok: [managed-node2] => { "network_provider": "nm" } 22736 1727204241.20961: no more pending results, returning what we have 22736 1727204241.20964: results queue empty 22736 1727204241.20965: checking for any_errors_fatal 22736 1727204241.20970: done checking for any_errors_fatal 22736 1727204241.20971: checking for max_fail_percentage 22736 1727204241.20973: done checking for max_fail_percentage 22736 1727204241.20974: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.20975: done checking to see if all hosts have failed 22736 1727204241.20975: getting the remaining hosts for this loop 22736 1727204241.20977: done getting the remaining hosts for this loop 22736 1727204241.20981: getting the next task for host managed-node2 22736 1727204241.20987: done getting next task for host managed-node2 22736 1727204241.20991: ^ task is: TASK: meta (flush_handlers) 22736 1727204241.20994: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204241.20997: getting variables 22736 1727204241.20998: in VariableManager get_vars() 22736 1727204241.21025: Calling all_inventory to load vars for managed-node2 22736 1727204241.21028: Calling groups_inventory to load vars for managed-node2 22736 1727204241.21031: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.21041: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.21044: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.21047: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.21200: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.21374: done with get_vars() 22736 1727204241.21381: done getting variables 22736 1727204241.21437: in VariableManager get_vars() 22736 1727204241.21444: Calling all_inventory to load vars for managed-node2 22736 1727204241.21445: Calling groups_inventory to load vars for managed-node2 22736 1727204241.21447: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.21450: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.21452: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.21454: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.21566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.21716: done with get_vars() 22736 1727204241.21726: done queuing things up, now waiting for results queue to drain 22736 1727204241.21727: results queue empty 22736 1727204241.21728: checking for any_errors_fatal 22736 1727204241.21729: done checking for any_errors_fatal 22736 1727204241.21730: checking for max_fail_percentage 22736 1727204241.21731: done checking for max_fail_percentage 22736 1727204241.21731: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.21732: done checking to see if all hosts have failed 22736 1727204241.21732: getting the remaining hosts for this loop 22736 1727204241.21733: done getting the remaining hosts for this loop 22736 1727204241.21735: getting the next task for host managed-node2 22736 1727204241.21741: done getting next task for host managed-node2 22736 1727204241.21742: ^ task is: TASK: meta (flush_handlers) 22736 1727204241.21743: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
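The result printed above for "Show network_provider" (ok: [managed-node2] => {"network_provider": "nm"}) comes from the debug action resolving the network_provider value that an earlier set_fact recorded, after the executor has applied the per-task connection settings traced above (ssh connection, 10s timeout, pipelining off, /bin/sh, ZIP_DEFLATED module compression). A minimal sketch of what the debug action boils down to, with illustrative names rather than Ansible's actual implementation:

    def run_debug(task_args: dict, task_vars: dict) -> dict:
        """Roughly what the 'debug' action returns as a task result."""
        if "var" in task_args:                      # e.g. debug: var=network_provider
            name = task_args["var"]
            return {name: task_vars.get(name, "VARIABLE IS NOT DEFINED!")}
        return {"msg": task_args.get("msg", "Hello world!")}

    print(run_debug({"var": "network_provider"}, {"network_provider": "nm"}))
    # -> {'network_provider': 'nm'}, which the callback renders as ok: [managed-node2] => {...}

The worker then sends that dict back on the results queue, which is the "sending task result" / "WORKER PROCESS EXITING" pair seen in the trace.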
False 22736 1727204241.21746: getting variables 22736 1727204241.21747: in VariableManager get_vars() 22736 1727204241.21755: Calling all_inventory to load vars for managed-node2 22736 1727204241.21756: Calling groups_inventory to load vars for managed-node2 22736 1727204241.21758: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.21762: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.21764: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.21766: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.21875: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.22041: done with get_vars() 22736 1727204241.22048: done getting variables 22736 1727204241.22084: in VariableManager get_vars() 22736 1727204241.22092: Calling all_inventory to load vars for managed-node2 22736 1727204241.22094: Calling groups_inventory to load vars for managed-node2 22736 1727204241.22096: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.22099: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.22101: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.22103: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.22210: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.22361: done with get_vars() 22736 1727204241.22370: done queuing things up, now waiting for results queue to drain 22736 1727204241.22372: results queue empty 22736 1727204241.22372: checking for any_errors_fatal 22736 1727204241.22373: done checking for any_errors_fatal 22736 1727204241.22374: checking for max_fail_percentage 22736 1727204241.22375: done checking for max_fail_percentage 22736 1727204241.22375: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.22376: done checking to see if all hosts have failed 22736 1727204241.22376: getting the remaining hosts for this loop 22736 1727204241.22377: done getting the remaining hosts for this loop 22736 1727204241.22379: getting the next task for host managed-node2 22736 1727204241.22381: done getting next task for host managed-node2 22736 1727204241.22381: ^ task is: None 22736 1727204241.22382: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
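The repeated "HOST STATE:" dumps above record the play iterator's per-host position: which block and task index are current, whether a rescue/always branch or handler run is active, and a run_state that in this trace is 0 while initial fact gathering is still pending, 1 while iterating regular tasks, and 5 once a play's tasks are exhausted (task is: None). A hedged sketch of those fields as a data structure, mirroring the dump labels rather than Ansible's internal class:

    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class HostStateDump:
        """Illustrative mirror of the 'HOST STATE:' dump; field names follow the
        dump labels, not necessarily Ansible's internal attribute names."""
        block: int = 0                  # index of the current block within the play
        task: int = 0                   # index of the current task within that block
        rescue: int = 0
        always: int = 0
        handlers: int = 0
        run_state: int = 1              # observed here: 0 pending setup, 1 iterating tasks, 5 complete
        fail_state: int = 0
        pre_flushing_run_state: Optional[int] = None
        update_handlers: bool = True
        pending_setup: bool = False
        tasks_child_state: Optional["HostStateDump"] = None
        rescue_child_state: Optional["HostStateDump"] = None
        always_child_state: Optional["HostStateDump"] = None
        did_rescue: bool = False
        did_start_at_task: bool = False

    # e.g. the state reported just before "Show network_provider" ran:
    # HostStateDump(block=2, task=2, run_state=1)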
False 22736 1727204241.22383: done queuing things up, now waiting for results queue to drain 22736 1727204241.22384: results queue empty 22736 1727204241.22384: checking for any_errors_fatal 22736 1727204241.22385: done checking for any_errors_fatal 22736 1727204241.22386: checking for max_fail_percentage 22736 1727204241.22386: done checking for max_fail_percentage 22736 1727204241.22387: checking to see if all hosts have failed and the running result is not ok 22736 1727204241.22387: done checking to see if all hosts have failed 22736 1727204241.22390: getting the next task for host managed-node2 22736 1727204241.22392: done getting next task for host managed-node2 22736 1727204241.22393: ^ task is: None 22736 1727204241.22394: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204241.22426: in VariableManager get_vars() 22736 1727204241.22438: done with get_vars() 22736 1727204241.22442: in VariableManager get_vars() 22736 1727204241.22449: done with get_vars() 22736 1727204241.22452: variable 'omit' from source: magic vars 22736 1727204241.22473: in VariableManager get_vars() 22736 1727204241.22480: done with get_vars() 22736 1727204241.22497: variable 'omit' from source: magic vars PLAY [Test configuring ethernet devices] *************************************** 22736 1727204241.22645: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204241.22664: getting the remaining hosts for this loop 22736 1727204241.22666: done getting the remaining hosts for this loop 22736 1727204241.22668: getting the next task for host managed-node2 22736 1727204241.22670: done getting next task for host managed-node2 22736 1727204241.22671: ^ task is: TASK: Gathering Facts 22736 1727204241.22672: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204241.22673: getting variables 22736 1727204241.22674: in VariableManager get_vars() 22736 1727204241.22680: Calling all_inventory to load vars for managed-node2 22736 1727204241.22682: Calling groups_inventory to load vars for managed-node2 22736 1727204241.22684: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204241.22688: Calling all_plugins_play to load vars for managed-node2 22736 1727204241.22691: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204241.22694: Calling groups_plugins_play to load vars for managed-node2 22736 1727204241.22831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204241.22980: done with get_vars() 22736 1727204241.22987: done getting variables 22736 1727204241.23020: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Tuesday 24 September 2024 14:57:21 -0400 (0:00:00.037) 0:00:06.015 ***** 22736 1727204241.23039: entering _queue_task() for managed-node2/gather_facts 22736 1727204241.23240: worker is 1 (out of 1 available) 22736 1727204241.23254: exiting _queue_task() for managed-node2/gather_facts 22736 1727204241.23267: done queuing things up, now waiting for results queue to drain 22736 1727204241.23268: waiting for pending results... 
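The fact gathering that follows goes through the same low-level sequence traced below for every module-based task: discover the remote home directory, create a per-task temporary directory, transfer the packed AnsiballZ module, mark it executable, run it with the remote Python, and finally remove the temporary directory, all over a multiplexed SSH connection (the debug1: auto-mux lines). A condensed sketch of that sequence, illustrative only and not Ansible's own code (the real run names the temp directory with a timestamp and random suffix and uses sftp for the transfer):

    import subprocess

    HOST = "10.31.9.159"  # managed node address from the trace
    TMPDIR = "/root/.ansible/tmp/ansible-tmp-EXAMPLE"  # placeholder for the generated name

    def ssh(cmd: str) -> str:
        """Run one shell command on the managed node over a reused SSH connection."""
        proc = subprocess.run(
            ["ssh", "-o", "ControlMaster=auto", "-o", "ControlPersist=60s", HOST, cmd],
            check=True, capture_output=True, text=True,
        )
        return proc.stdout

    home = ssh("echo ~ && sleep 0")                                     # discover remote home (/root)
    ssh(f'( umask 77 && mkdir -p "{TMPDIR}" ) && sleep 0')              # per-task temp dir
    subprocess.run(["scp", "AnsiballZ_setup.py", f"{HOST}:{TMPDIR}/"], check=True)  # module upload
    ssh(f"chmod u+x {TMPDIR}/ {TMPDIR}/AnsiballZ_setup.py && sleep 0")  # make module executable
    facts_json = ssh(f"/usr/bin/python3.12 {TMPDIR}/AnsiballZ_setup.py && sleep 0")  # run setup module
    ssh(f"rm -f -r {TMPDIR}/ > /dev/null 2>&1 && sleep 0")              # clean up the temp dir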
22736 1727204241.23424: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204241.23479: in run() - task 12b410aa-8751-4f4a-548a-0000000000f0 22736 1727204241.23498: variable 'ansible_search_path' from source: unknown 22736 1727204241.23529: calling self._execute() 22736 1727204241.23594: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.23603: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.23617: variable 'omit' from source: magic vars 22736 1727204241.23920: variable 'ansible_distribution_major_version' from source: facts 22736 1727204241.23928: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204241.23941: variable 'omit' from source: magic vars 22736 1727204241.23962: variable 'omit' from source: magic vars 22736 1727204241.23992: variable 'omit' from source: magic vars 22736 1727204241.24026: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204241.24060: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204241.24077: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204241.24095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.24107: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204241.24134: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204241.24138: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.24142: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.24229: Set connection var ansible_timeout to 10 22736 1727204241.24239: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204241.24248: Set connection var ansible_shell_executable to /bin/sh 22736 1727204241.24251: Set connection var ansible_shell_type to sh 22736 1727204241.24259: Set connection var ansible_pipelining to False 22736 1727204241.24262: Set connection var ansible_connection to ssh 22736 1727204241.24284: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.24287: variable 'ansible_connection' from source: unknown 22736 1727204241.24291: variable 'ansible_module_compression' from source: unknown 22736 1727204241.24294: variable 'ansible_shell_type' from source: unknown 22736 1727204241.24299: variable 'ansible_shell_executable' from source: unknown 22736 1727204241.24303: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204241.24308: variable 'ansible_pipelining' from source: unknown 22736 1727204241.24314: variable 'ansible_timeout' from source: unknown 22736 1727204241.24317: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204241.24466: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204241.24477: variable 'omit' from source: magic vars 22736 1727204241.24482: starting attempt loop 22736 1727204241.24486: running the 
handler 22736 1727204241.24504: variable 'ansible_facts' from source: unknown 22736 1727204241.24522: _low_level_execute_command(): starting 22736 1727204241.24529: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204241.25082: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204241.25088: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204241.25094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.25155: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.25162: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204241.25166: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.25205: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.26952: stdout chunk (state=3): >>>/root <<< 22736 1727204241.27061: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204241.27120: stderr chunk (state=3): >>><<< 22736 1727204241.27124: stdout chunk (state=3): >>><<< 22736 1727204241.27147: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204241.27158: _low_level_execute_command(): starting 22736 1727204241.27164: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260 `" && echo ansible-tmp-1727204241.2714574-23059-106117864417260="` echo 
/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260 `" ) && sleep 0' 22736 1727204241.27630: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204241.27634: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204241.27636: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204241.27648: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.27698: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.27702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.27746: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.29733: stdout chunk (state=3): >>>ansible-tmp-1727204241.2714574-23059-106117864417260=/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260 <<< 22736 1727204241.29848: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204241.29899: stderr chunk (state=3): >>><<< 22736 1727204241.29903: stdout chunk (state=3): >>><<< 22736 1727204241.29925: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204241.2714574-23059-106117864417260=/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204241.29954: variable 'ansible_module_compression' from source: unknown 22736 1727204241.29996: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204241.30051: 
variable 'ansible_facts' from source: unknown 22736 1727204241.30173: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py 22736 1727204241.30297: Sending initial data 22736 1727204241.30301: Sent initial data (154 bytes) 22736 1727204241.30775: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204241.30778: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204241.30781: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204241.30785: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204241.30787: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.30840: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.30847: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.30884: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.32493: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22736 1727204241.32502: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204241.32533: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204241.32565: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpj0hd8lc3 /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py <<< 22736 1727204241.32572: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py" <<< 22736 1727204241.32605: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpj0hd8lc3" to remote "/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py" <<< 22736 1727204241.32609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py" <<< 22736 1727204241.34284: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204241.34398: stderr chunk (state=3): >>><<< 22736 1727204241.34402: stdout chunk (state=3): >>><<< 22736 1727204241.34404: done transferring module to remote 22736 1727204241.34412: _low_level_execute_command(): starting 22736 1727204241.34417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/ /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py && sleep 0' 22736 1727204241.34902: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204241.34906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204241.34909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204241.34911: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204241.34913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.34971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.34976: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.35014: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204241.36869: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204241.36926: stderr chunk (state=3): >>><<< 22736 1727204241.36929: stdout chunk (state=3): >>><<< 22736 1727204241.36945: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204241.36949: _low_level_execute_command(): starting 22736 1727204241.36954: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/AnsiballZ_setup.py && sleep 0' 22736 1727204241.37426: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204241.37431: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204241.37434: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204241.37437: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204241.37490: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204241.37499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204241.37543: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.06820: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", 
"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.691661Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721691661", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": 
"UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2837, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 880, "free": 2837}, "nocache": {"free": 3467, "used": 250}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 745, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147079680, "block_size": 4096, "block_total": 64479564, "block_available": 61315205, "block_used": 3164359, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", 
"netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "<<< 22736 1727204242.06832: stdout chunk (state=3): >>>type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.93798828125, "5m": 0.66455078125, "15m": 0.40380859375}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204242.08977: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204242.09044: stderr chunk (state=3): >>><<< 22736 1727204242.09048: stdout chunk (state=3): >>><<< 22736 1727204242.09074: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "21", "epoch": "1727204241", "epoch_int": "1727204241", "date": "2024-09-24", "time": "14:57:21", "iso8601_micro": "2024-09-24T18:57:21.691661Z", "iso8601": "2024-09-24T18:57:21Z", "iso8601_basic": "20240924T145721691661", "iso8601_basic_short": "20240924T145721", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2837, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 880, "free": 2837}, "nocache": {"free": 3467, "used": 250}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", 
"sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 745, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251147079680, "block_size": 4096, "block_total": 64479564, "block_available": 61315205, "block_used": 3164359, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "ansible_loadavg": {"1m": 0.93798828125, "5m": 0.66455078125, "15m": 0.40380859375}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_iscsi_iqn": "", "ansible_fips": false, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", 
"gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204242.09352: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204242.09373: _low_level_execute_command(): starting 22736 1727204242.09383: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204241.2714574-23059-106117864417260/ > /dev/null 2>&1 && sleep 0' 22736 1727204242.09905: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.09909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204242.09913: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204242.09916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204242.09918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
<<< 22736 1727204242.09975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204242.09982: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204242.09984: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.10022: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.36133: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204242.36138: stdout chunk (state=3): >>><<< 22736 1727204242.36140: stderr chunk (state=3): >>><<< 22736 1727204242.36159: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204242.36173: handler run complete 22736 1727204242.36384: variable 'ansible_facts' from source: unknown 22736 1727204242.36694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.37033: variable 'ansible_facts' from source: unknown 22736 1727204242.37160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.37366: attempt loop complete, returning result 22736 1727204242.37376: _execute() done 22736 1727204242.37384: dumping result to json 22736 1727204242.37425: done dumping result, returning 22736 1727204242.37438: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-0000000000f0] 22736 1727204242.37447: sending task result for task 12b410aa-8751-4f4a-548a-0000000000f0 22736 1727204242.37996: done sending task result for task 12b410aa-8751-4f4a-548a-0000000000f0 22736 1727204242.38005: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204242.38603: no more pending results, returning what we have 22736 1727204242.38607: results queue empty 22736 1727204242.38615: checking for any_errors_fatal 22736 1727204242.38617: done checking for any_errors_fatal 22736 1727204242.38618: checking for max_fail_percentage 22736 1727204242.38620: done checking for max_fail_percentage 22736 1727204242.38621: checking to see if all hosts have failed and the running result is not ok 22736 1727204242.38622: done checking to see if all hosts have failed 22736 1727204242.38623: getting the remaining hosts for this loop 22736 1727204242.38624: done getting the remaining hosts for this loop 22736 1727204242.38628: getting the 
next task for host managed-node2 22736 1727204242.38634: done getting next task for host managed-node2 22736 1727204242.38637: ^ task is: TASK: meta (flush_handlers) 22736 1727204242.38639: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204242.38643: getting variables 22736 1727204242.38644: in VariableManager get_vars() 22736 1727204242.38669: Calling all_inventory to load vars for managed-node2 22736 1727204242.38673: Calling groups_inventory to load vars for managed-node2 22736 1727204242.38677: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.38691: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.38696: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.38700: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.38905: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.39178: done with get_vars() 22736 1727204242.39194: done getting variables 22736 1727204242.39285: in VariableManager get_vars() 22736 1727204242.39300: Calling all_inventory to load vars for managed-node2 22736 1727204242.39303: Calling groups_inventory to load vars for managed-node2 22736 1727204242.39306: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.39315: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.39318: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.39322: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.39575: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.39902: done with get_vars() 22736 1727204242.39927: done queuing things up, now waiting for results queue to drain 22736 1727204242.39930: results queue empty 22736 1727204242.39931: checking for any_errors_fatal 22736 1727204242.39936: done checking for any_errors_fatal 22736 1727204242.39937: checking for max_fail_percentage 22736 1727204242.39938: done checking for max_fail_percentage 22736 1727204242.39946: checking to see if all hosts have failed and the running result is not ok 22736 1727204242.39947: done checking to see if all hosts have failed 22736 1727204242.39948: getting the remaining hosts for this loop 22736 1727204242.39949: done getting the remaining hosts for this loop 22736 1727204242.39952: getting the next task for host managed-node2 22736 1727204242.39957: done getting next task for host managed-node2 22736 1727204242.39960: ^ task is: TASK: Set type={{ type }} and interface={{ interface }} 22736 1727204242.39962: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204242.39965: getting variables 22736 1727204242.39966: in VariableManager get_vars() 22736 1727204242.39977: Calling all_inventory to load vars for managed-node2 22736 1727204242.39979: Calling groups_inventory to load vars for managed-node2 22736 1727204242.39982: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.39988: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.39994: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.39998: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.40199: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.40506: done with get_vars() 22736 1727204242.40519: done getting variables 22736 1727204242.40570: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204242.40755: variable 'type' from source: play vars 22736 1727204242.40761: variable 'interface' from source: play vars TASK [Set type=veth and interface=lsr27] *************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:20 Tuesday 24 September 2024 14:57:22 -0400 (0:00:01.177) 0:00:07.193 ***** 22736 1727204242.40824: entering _queue_task() for managed-node2/set_fact 22736 1727204242.41308: worker is 1 (out of 1 available) 22736 1727204242.41322: exiting _queue_task() for managed-node2/set_fact 22736 1727204242.41334: done queuing things up, now waiting for results queue to drain 22736 1727204242.41336: waiting for pending results... 
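The banner just above corresponds to a plain set_fact task at tests_ethernet.yml:20. The log only shows the rendered task name, the play-vars source of 'type' and 'interface', and the resulting facts, so the following is a minimal sketch of the assumed task shape, not the verbatim file contents:

    # Assumed shape of the task at tests_ethernet.yml:20 (sketch, not quoted from the file)
    - name: "Set type={{ type }} and interface={{ interface }}"
      set_fact:
        type: "{{ type }}"            # play var; renders to "veth" in this run
        interface: "{{ interface }}"  # play var; renders to "lsr27" in this run

Because set_fact stores the values as per-host facts, they show up under "ansible_facts" in the task result further down.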
22736 1727204242.41546: running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 22736 1727204242.41676: in run() - task 12b410aa-8751-4f4a-548a-00000000000f 22736 1727204242.41703: variable 'ansible_search_path' from source: unknown 22736 1727204242.41762: calling self._execute() 22736 1727204242.41864: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.41879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.41919: variable 'omit' from source: magic vars 22736 1727204242.42440: variable 'ansible_distribution_major_version' from source: facts 22736 1727204242.42444: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204242.42446: variable 'omit' from source: magic vars 22736 1727204242.42505: variable 'omit' from source: magic vars 22736 1727204242.42555: variable 'type' from source: play vars 22736 1727204242.42672: variable 'type' from source: play vars 22736 1727204242.42698: variable 'interface' from source: play vars 22736 1727204242.42841: variable 'interface' from source: play vars 22736 1727204242.42858: variable 'omit' from source: magic vars 22736 1727204242.42998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204242.43002: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204242.43027: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204242.43061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204242.43197: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204242.43203: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204242.43205: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.43208: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.43356: Set connection var ansible_timeout to 10 22736 1727204242.43377: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204242.43407: Set connection var ansible_shell_executable to /bin/sh 22736 1727204242.43424: Set connection var ansible_shell_type to sh 22736 1727204242.43443: Set connection var ansible_pipelining to False 22736 1727204242.43543: Set connection var ansible_connection to ssh 22736 1727204242.43547: variable 'ansible_shell_executable' from source: unknown 22736 1727204242.43549: variable 'ansible_connection' from source: unknown 22736 1727204242.43551: variable 'ansible_module_compression' from source: unknown 22736 1727204242.43553: variable 'ansible_shell_type' from source: unknown 22736 1727204242.43555: variable 'ansible_shell_executable' from source: unknown 22736 1727204242.43557: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.43559: variable 'ansible_pipelining' from source: unknown 22736 1727204242.43561: variable 'ansible_timeout' from source: unknown 22736 1727204242.43563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.43738: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204242.43763: variable 'omit' from source: magic vars 22736 1727204242.43773: starting attempt loop 22736 1727204242.43786: running the handler 22736 1727204242.43810: handler run complete 22736 1727204242.43870: attempt loop complete, returning result 22736 1727204242.43873: _execute() done 22736 1727204242.43876: dumping result to json 22736 1727204242.43878: done dumping result, returning 22736 1727204242.43880: done running TaskExecutor() for managed-node2/TASK: Set type=veth and interface=lsr27 [12b410aa-8751-4f4a-548a-00000000000f] 22736 1727204242.43883: sending task result for task 12b410aa-8751-4f4a-548a-00000000000f ok: [managed-node2] => { "ansible_facts": { "interface": "lsr27", "type": "veth" }, "changed": false } 22736 1727204242.44150: no more pending results, returning what we have 22736 1727204242.44154: results queue empty 22736 1727204242.44156: checking for any_errors_fatal 22736 1727204242.44160: done checking for any_errors_fatal 22736 1727204242.44161: checking for max_fail_percentage 22736 1727204242.44163: done checking for max_fail_percentage 22736 1727204242.44163: checking to see if all hosts have failed and the running result is not ok 22736 1727204242.44165: done checking to see if all hosts have failed 22736 1727204242.44166: getting the remaining hosts for this loop 22736 1727204242.44168: done getting the remaining hosts for this loop 22736 1727204242.44173: getting the next task for host managed-node2 22736 1727204242.44180: done getting next task for host managed-node2 22736 1727204242.44184: ^ task is: TASK: Include the task 'show_interfaces.yml' 22736 1727204242.44186: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204242.44192: getting variables 22736 1727204242.44194: in VariableManager get_vars() 22736 1727204242.44341: Calling all_inventory to load vars for managed-node2 22736 1727204242.44357: Calling groups_inventory to load vars for managed-node2 22736 1727204242.44377: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.44386: done sending task result for task 12b410aa-8751-4f4a-548a-00000000000f 22736 1727204242.44391: WORKER PROCESS EXITING 22736 1727204242.44401: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.44405: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.44409: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.44645: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.44817: done with get_vars() 22736 1727204242.44826: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:24 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.040) 0:00:07.233 ***** 22736 1727204242.44900: entering _queue_task() for managed-node2/include_tasks 22736 1727204242.45130: worker is 1 (out of 1 available) 22736 1727204242.45143: exiting _queue_task() for managed-node2/include_tasks 22736 1727204242.45157: done queuing things up, now waiting for results queue to drain 22736 1727204242.45158: waiting for pending results... 22736 1727204242.45308: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 22736 1727204242.45392: in run() - task 12b410aa-8751-4f4a-548a-000000000010 22736 1727204242.45497: variable 'ansible_search_path' from source: unknown 22736 1727204242.45503: calling self._execute() 22736 1727204242.45506: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.45510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.45515: variable 'omit' from source: magic vars 22736 1727204242.45847: variable 'ansible_distribution_major_version' from source: facts 22736 1727204242.45865: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204242.45879: _execute() done 22736 1727204242.45888: dumping result to json 22736 1727204242.45901: done dumping result, returning 22736 1727204242.45906: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-4f4a-548a-000000000010] 22736 1727204242.45916: sending task result for task 12b410aa-8751-4f4a-548a-000000000010 22736 1727204242.46046: no more pending results, returning what we have 22736 1727204242.46052: in VariableManager get_vars() 22736 1727204242.46086: Calling all_inventory to load vars for managed-node2 22736 1727204242.46091: Calling groups_inventory to load vars for managed-node2 22736 1727204242.46095: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.46112: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.46118: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.46122: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.46464: done sending task result for task 12b410aa-8751-4f4a-548a-000000000010 22736 1727204242.46468: WORKER PROCESS EXITING 22736 1727204242.46593: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.46981: done with get_vars() 22736 1727204242.47002: variable 'ansible_search_path' from source: unknown 22736 1727204242.47020: we have included files to process 22736 1727204242.47022: generating all_blocks data 22736 1727204242.47024: done generating all_blocks data 22736 1727204242.47025: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204242.47026: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204242.47030: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204242.47323: in VariableManager get_vars() 22736 1727204242.47343: done with get_vars() 22736 1727204242.47528: done processing included file 22736 1727204242.47531: iterating over new_blocks loaded from include file 22736 1727204242.47533: in VariableManager get_vars() 22736 1727204242.47705: done with get_vars() 22736 1727204242.47708: filtering new block on tags 22736 1727204242.47732: done filtering new block on tags 22736 1727204242.47735: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 22736 1727204242.47740: extending task lists for all hosts with included blocks 22736 1727204242.47955: done extending task lists 22736 1727204242.47957: done processing included files 22736 1727204242.47958: results queue empty 22736 1727204242.47959: checking for any_errors_fatal 22736 1727204242.47962: done checking for any_errors_fatal 22736 1727204242.47963: checking for max_fail_percentage 22736 1727204242.47965: done checking for max_fail_percentage 22736 1727204242.47966: checking to see if all hosts have failed and the running result is not ok 22736 1727204242.47967: done checking to see if all hosts have failed 22736 1727204242.47968: getting the remaining hosts for this loop 22736 1727204242.47969: done getting the remaining hosts for this loop 22736 1727204242.47972: getting the next task for host managed-node2 22736 1727204242.47976: done getting next task for host managed-node2 22736 1727204242.47979: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22736 1727204242.47983: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204242.47985: getting variables 22736 1727204242.47986: in VariableManager get_vars() 22736 1727204242.47999: Calling all_inventory to load vars for managed-node2 22736 1727204242.48002: Calling groups_inventory to load vars for managed-node2 22736 1727204242.48005: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.48011: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.48014: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.48018: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.48257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.48520: done with get_vars() 22736 1727204242.48530: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.037) 0:00:07.271 ***** 22736 1727204242.48615: entering _queue_task() for managed-node2/include_tasks 22736 1727204242.48887: worker is 1 (out of 1 available) 22736 1727204242.48904: exiting _queue_task() for managed-node2/include_tasks 22736 1727204242.48918: done queuing things up, now waiting for results queue to drain 22736 1727204242.48920: waiting for pending results... 22736 1727204242.49157: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 22736 1727204242.49274: in run() - task 12b410aa-8751-4f4a-548a-000000000104 22736 1727204242.49300: variable 'ansible_search_path' from source: unknown 22736 1727204242.49311: variable 'ansible_search_path' from source: unknown 22736 1727204242.49360: calling self._execute() 22736 1727204242.49458: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.49473: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.49491: variable 'omit' from source: magic vars 22736 1727204242.49976: variable 'ansible_distribution_major_version' from source: facts 22736 1727204242.50030: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204242.50243: _execute() done 22736 1727204242.50246: dumping result to json 22736 1727204242.50249: done dumping result, returning 22736 1727204242.50251: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-4f4a-548a-000000000104] 22736 1727204242.50253: sending task result for task 12b410aa-8751-4f4a-548a-000000000104 22736 1727204242.50326: done sending task result for task 12b410aa-8751-4f4a-548a-000000000104 22736 1727204242.50329: WORKER PROCESS EXITING 22736 1727204242.50522: no more pending results, returning what we have 22736 1727204242.50528: in VariableManager get_vars() 22736 1727204242.50564: Calling all_inventory to load vars for managed-node2 22736 1727204242.50568: Calling groups_inventory to load vars for managed-node2 22736 1727204242.50573: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.50593: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.50598: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.50602: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.51239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 22736 1727204242.52156: done with get_vars() 22736 1727204242.52164: variable 'ansible_search_path' from source: unknown 22736 1727204242.52165: variable 'ansible_search_path' from source: unknown 22736 1727204242.52215: we have included files to process 22736 1727204242.52217: generating all_blocks data 22736 1727204242.52219: done generating all_blocks data 22736 1727204242.52220: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204242.52222: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204242.52224: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204242.52659: done processing included file 22736 1727204242.52662: iterating over new_blocks loaded from include file 22736 1727204242.52664: in VariableManager get_vars() 22736 1727204242.52678: done with get_vars() 22736 1727204242.52680: filtering new block on tags 22736 1727204242.52705: done filtering new block on tags 22736 1727204242.52708: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for managed-node2 22736 1727204242.52713: extending task lists for all hosts with included blocks 22736 1727204242.52876: done extending task lists 22736 1727204242.52878: done processing included files 22736 1727204242.52879: results queue empty 22736 1727204242.52880: checking for any_errors_fatal 22736 1727204242.52883: done checking for any_errors_fatal 22736 1727204242.52884: checking for max_fail_percentage 22736 1727204242.52885: done checking for max_fail_percentage 22736 1727204242.52886: checking to see if all hosts have failed and the running result is not ok 22736 1727204242.52887: done checking to see if all hosts have failed 22736 1727204242.52888: getting the remaining hosts for this loop 22736 1727204242.52891: done getting the remaining hosts for this loop 22736 1727204242.52894: getting the next task for host managed-node2 22736 1727204242.52899: done getting next task for host managed-node2 22736 1727204242.52902: ^ task is: TASK: Gather current interface info 22736 1727204242.52905: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204242.52907: getting variables 22736 1727204242.52909: in VariableManager get_vars() 22736 1727204242.52922: Calling all_inventory to load vars for managed-node2 22736 1727204242.52925: Calling groups_inventory to load vars for managed-node2 22736 1727204242.52928: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204242.52935: Calling all_plugins_play to load vars for managed-node2 22736 1727204242.52938: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204242.52942: Calling groups_plugins_play to load vars for managed-node2 22736 1727204242.53147: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204242.53452: done with get_vars() 22736 1727204242.53462: done getting variables 22736 1727204242.53516: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:22 -0400 (0:00:00.049) 0:00:07.320 ***** 22736 1727204242.53556: entering _queue_task() for managed-node2/command 22736 1727204242.53879: worker is 1 (out of 1 available) 22736 1727204242.53895: exiting _queue_task() for managed-node2/command 22736 1727204242.53911: done queuing things up, now waiting for results queue to drain 22736 1727204242.53915: waiting for pending results... 
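For orientation: the two include_tasks steps traced above pull in show_interfaces.yml (from tests_ethernet.yml:24) and, from there, get_current_interfaces.yml (show_interfaces.yml:3); the 'Gather current interface info' task just queued is the first task of that included file. A rough sketch of the include chain follows; the relative paths are assumptions inferred from the file locations in the log, not quoted from the files:

    # tests_ethernet.yml:24 (assumed shape)
    - name: Include the task 'show_interfaces.yml'
      include_tasks: tasks/show_interfaces.yml

    # tasks/show_interfaces.yml:3 (assumed shape)
    - name: Include the task 'get_current_interfaces.yml'
      include_tasks: get_current_interfaces.yml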
22736 1727204242.54461: running TaskExecutor() for managed-node2/TASK: Gather current interface info 22736 1727204242.54617: in run() - task 12b410aa-8751-4f4a-548a-000000000115 22736 1727204242.54637: variable 'ansible_search_path' from source: unknown 22736 1727204242.54655: variable 'ansible_search_path' from source: unknown 22736 1727204242.54701: calling self._execute() 22736 1727204242.54825: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.54829: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.54842: variable 'omit' from source: magic vars 22736 1727204242.55646: variable 'ansible_distribution_major_version' from source: facts 22736 1727204242.55650: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204242.55653: variable 'omit' from source: magic vars 22736 1727204242.55824: variable 'omit' from source: magic vars 22736 1727204242.55835: variable 'omit' from source: magic vars 22736 1727204242.55900: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204242.55950: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204242.56099: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204242.56126: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204242.56142: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204242.56202: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204242.56212: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.56221: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.56497: Set connection var ansible_timeout to 10 22736 1727204242.56501: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204242.56503: Set connection var ansible_shell_executable to /bin/sh 22736 1727204242.56506: Set connection var ansible_shell_type to sh 22736 1727204242.56508: Set connection var ansible_pipelining to False 22736 1727204242.56517: Set connection var ansible_connection to ssh 22736 1727204242.56737: variable 'ansible_shell_executable' from source: unknown 22736 1727204242.56743: variable 'ansible_connection' from source: unknown 22736 1727204242.56747: variable 'ansible_module_compression' from source: unknown 22736 1727204242.56749: variable 'ansible_shell_type' from source: unknown 22736 1727204242.56751: variable 'ansible_shell_executable' from source: unknown 22736 1727204242.56754: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204242.56756: variable 'ansible_pipelining' from source: unknown 22736 1727204242.56758: variable 'ansible_timeout' from source: unknown 22736 1727204242.56760: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204242.57107: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204242.57111: variable 'omit' from source: magic vars 22736 
1727204242.57123: starting attempt loop 22736 1727204242.57210: running the handler 22736 1727204242.57213: _low_level_execute_command(): starting 22736 1727204242.57216: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204242.58772: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204242.58793: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204242.58812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.58841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204242.58869: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204242.58986: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204242.59208: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.59325: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.61115: stdout chunk (state=3): >>>/root <<< 22736 1727204242.61228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204242.61278: stderr chunk (state=3): >>><<< 22736 1727204242.61281: stdout chunk (state=3): >>><<< 22736 1727204242.61308: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204242.61324: _low_level_execute_command(): starting 22736 1727204242.61330: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812 `" && echo 
ansible-tmp-1727204242.613079-23262-159843665917812="` echo /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812 `" ) && sleep 0' 22736 1727204242.61794: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.61798: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.61802: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204242.61810: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.61853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204242.61856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.61901: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.63948: stdout chunk (state=3): >>>ansible-tmp-1727204242.613079-23262-159843665917812=/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812 <<< 22736 1727204242.64067: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204242.64114: stderr chunk (state=3): >>><<< 22736 1727204242.64120: stdout chunk (state=3): >>><<< 22736 1727204242.64136: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204242.613079-23262-159843665917812=/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204242.64165: variable 'ansible_module_compression' from source: unknown 22736 1727204242.64215: ANSIBALLZ: Using generic lock for ansible.legacy.command 22736 1727204242.64220: ANSIBALLZ: Acquiring 
lock 22736 1727204242.64224: ANSIBALLZ: Lock acquired: 140553536881728 22736 1727204242.64230: ANSIBALLZ: Creating module 22736 1727204242.74888: ANSIBALLZ: Writing module into payload 22736 1727204242.74971: ANSIBALLZ: Writing module 22736 1727204242.74993: ANSIBALLZ: Renaming module 22736 1727204242.74998: ANSIBALLZ: Done creating module 22736 1727204242.75015: variable 'ansible_facts' from source: unknown 22736 1727204242.75064: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py 22736 1727204242.75185: Sending initial data 22736 1727204242.75188: Sent initial data (155 bytes) 22736 1727204242.75680: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.75683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.75686: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.75690: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.75756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204242.75761: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204242.75766: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.75804: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.77569: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204242.77607: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204242.77641: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpffsjke1r /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py <<< 22736 1727204242.77649: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py" <<< 22736 1727204242.77677: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpffsjke1r" to remote "/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py" <<< 22736 1727204242.77684: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py" <<< 22736 1727204242.78448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204242.78520: stderr chunk (state=3): >>><<< 22736 1727204242.78523: stdout chunk (state=3): >>><<< 22736 1727204242.78544: done transferring module to remote 22736 1727204242.78555: _low_level_execute_command(): starting 22736 1727204242.78561: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/ /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py && sleep 0' 22736 1727204242.79025: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204242.79029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204242.79033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204242.79035: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.79095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204242.79098: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.79137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.81085: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204242.81133: stderr chunk (state=3): >>><<< 22736 1727204242.81136: stdout chunk (state=3): >>><<< 22736 1727204242.81152: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204242.81155: _low_level_execute_command(): starting 22736 1727204242.81161: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/AnsiballZ_command.py && sleep 0' 22736 1727204242.81613: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204242.81617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204242.81620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204242.81622: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204242.81624: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204242.81678: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204242.81682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204242.81729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204242.99551: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.990889", "end": "2024-09-24 14:57:22.994623", "delta": "0:00:00.003734", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204243.01238: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204243.01305: stderr chunk (state=3): >>><<< 22736 1727204243.01309: stdout chunk (state=3): >>><<< 22736 1727204243.01328: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:22.990889", "end": "2024-09-24 14:57:22.994623", "delta": "0:00:00.003734", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
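The module JSON above spells out the invocation: ansible.legacy.command ran 'ls -1' with chdir /sys/class/net and reported changed: true, while the rendered task result further down shows changed: false, and the following task reads a variable named '_current_interfaces'. The task at get_current_interfaces.yml:3 therefore presumably looks roughly like the sketch below; the register name and the changed_when setting are inferred from those observations, not quoted from the file:

    # Assumed shape of the task at get_current_interfaces.yml:3 (sketch)
    - name: Gather current interface info
      command:
        cmd: ls -1
        chdir: /sys/class/net
      register: _current_interfaces   # inferred from the later "Set current_interfaces" task
      changed_when: false             # inferred from the rendered result reporting changed: false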
22736 1727204243.01365: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204243.01374: _low_level_execute_command(): starting 22736 1727204243.01379: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204242.613079-23262-159843665917812/ > /dev/null 2>&1 && sleep 0' 22736 1727204243.01881: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204243.01885: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204243.01887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204243.01893: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204243.01895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.01955: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.01961: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.02003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.03962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.04020: stderr chunk (state=3): >>><<< 22736 1727204243.04023: stdout chunk (state=3): >>><<< 22736 1727204243.04038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.04047: handler run complete 22736 1727204243.04072: Evaluated conditional (False): False 22736 1727204243.04086: attempt loop complete, returning result 22736 1727204243.04091: _execute() done 22736 1727204243.04096: dumping result to json 22736 1727204243.04102: done dumping result, returning 22736 1727204243.04111: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-4f4a-548a-000000000115] 22736 1727204243.04118: sending task result for task 12b410aa-8751-4f4a-548a-000000000115 22736 1727204243.04227: done sending task result for task 12b410aa-8751-4f4a-548a-000000000115 22736 1727204243.04231: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003734", "end": "2024-09-24 14:57:22.994623", "rc": 0, "start": "2024-09-24 14:57:22.990889" } STDOUT: bonding_masters eth0 lo 22736 1727204243.04407: no more pending results, returning what we have 22736 1727204243.04410: results queue empty 22736 1727204243.04411: checking for any_errors_fatal 22736 1727204243.04413: done checking for any_errors_fatal 22736 1727204243.04414: checking for max_fail_percentage 22736 1727204243.04415: done checking for max_fail_percentage 22736 1727204243.04416: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.04417: done checking to see if all hosts have failed 22736 1727204243.04418: getting the remaining hosts for this loop 22736 1727204243.04420: done getting the remaining hosts for this loop 22736 1727204243.04424: getting the next task for host managed-node2 22736 1727204243.04430: done getting next task for host managed-node2 22736 1727204243.04433: ^ task is: TASK: Set current_interfaces 22736 1727204243.04437: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.04440: getting variables 22736 1727204243.04441: in VariableManager get_vars() 22736 1727204243.04472: Calling all_inventory to load vars for managed-node2 22736 1727204243.04475: Calling groups_inventory to load vars for managed-node2 22736 1727204243.04479: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.04493: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.04496: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.04500: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.04649: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.04809: done with get_vars() 22736 1727204243.04819: done getting variables 22736 1727204243.04866: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.513) 0:00:07.833 ***** 22736 1727204243.04893: entering _queue_task() for managed-node2/set_fact 22736 1727204243.05107: worker is 1 (out of 1 available) 22736 1727204243.05120: exiting _queue_task() for managed-node2/set_fact 22736 1727204243.05132: done queuing things up, now waiting for results queue to drain 22736 1727204243.05134: waiting for pending results... 
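The two task results around this point ("Gather current interface info" and the queued "Set current_interfaces") correspond to the first tasks in get_current_interfaces.yml. A minimal sketch of what those tasks likely look like, reconstructed only from the module arguments shown in the log (chdir: /sys/class/net, ls -1), the registered variable name _current_interfaces, and the resulting current_interfaces fact; the actual file in the fedora.linux_system_roles test collection may differ in argument form or wording:

- name: Gather current interface info
  command:
    cmd: ls -1
    chdir: /sys/class/net
  register: _current_interfaces   # raw 'ls -1' output, one interface name per line

- name: Set current_interfaces
  set_fact:
    # stdout_lines turns the registered output into the list reported in the
    # task result below: ['bonding_masters', 'eth0', 'lo']
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"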
22736 1727204243.05298: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 22736 1727204243.05374: in run() - task 12b410aa-8751-4f4a-548a-000000000116 22736 1727204243.05386: variable 'ansible_search_path' from source: unknown 22736 1727204243.05390: variable 'ansible_search_path' from source: unknown 22736 1727204243.05426: calling self._execute() 22736 1727204243.05495: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.05503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.05513: variable 'omit' from source: magic vars 22736 1727204243.05828: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.05838: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.05846: variable 'omit' from source: magic vars 22736 1727204243.05891: variable 'omit' from source: magic vars 22736 1727204243.05982: variable '_current_interfaces' from source: set_fact 22736 1727204243.06041: variable 'omit' from source: magic vars 22736 1727204243.06075: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.06108: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.06133: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.06150: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.06161: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.06190: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.06194: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.06199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.06285: Set connection var ansible_timeout to 10 22736 1727204243.06298: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.06306: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.06309: Set connection var ansible_shell_type to sh 22736 1727204243.06319: Set connection var ansible_pipelining to False 22736 1727204243.06322: Set connection var ansible_connection to ssh 22736 1727204243.06343: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.06348: variable 'ansible_connection' from source: unknown 22736 1727204243.06352: variable 'ansible_module_compression' from source: unknown 22736 1727204243.06355: variable 'ansible_shell_type' from source: unknown 22736 1727204243.06357: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.06363: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.06366: variable 'ansible_pipelining' from source: unknown 22736 1727204243.06371: variable 'ansible_timeout' from source: unknown 22736 1727204243.06376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.06499: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22736 1727204243.06509: variable 'omit' from source: magic vars 22736 1727204243.06514: starting attempt loop 22736 1727204243.06521: running the handler 22736 1727204243.06532: handler run complete 22736 1727204243.06541: attempt loop complete, returning result 22736 1727204243.06544: _execute() done 22736 1727204243.06547: dumping result to json 22736 1727204243.06554: done dumping result, returning 22736 1727204243.06563: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-4f4a-548a-000000000116] 22736 1727204243.06566: sending task result for task 12b410aa-8751-4f4a-548a-000000000116 22736 1727204243.06653: done sending task result for task 12b410aa-8751-4f4a-548a-000000000116 22736 1727204243.06656: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22736 1727204243.06731: no more pending results, returning what we have 22736 1727204243.06734: results queue empty 22736 1727204243.06735: checking for any_errors_fatal 22736 1727204243.06743: done checking for any_errors_fatal 22736 1727204243.06745: checking for max_fail_percentage 22736 1727204243.06746: done checking for max_fail_percentage 22736 1727204243.06747: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.06748: done checking to see if all hosts have failed 22736 1727204243.06749: getting the remaining hosts for this loop 22736 1727204243.06751: done getting the remaining hosts for this loop 22736 1727204243.06755: getting the next task for host managed-node2 22736 1727204243.06763: done getting next task for host managed-node2 22736 1727204243.06766: ^ task is: TASK: Show current_interfaces 22736 1727204243.06769: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.06774: getting variables 22736 1727204243.06776: in VariableManager get_vars() 22736 1727204243.06805: Calling all_inventory to load vars for managed-node2 22736 1727204243.06808: Calling groups_inventory to load vars for managed-node2 22736 1727204243.06811: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.06822: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.06825: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.06829: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.06974: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.07154: done with get_vars() 22736 1727204243.07162: done getting variables 22736 1727204243.07211: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.023) 0:00:07.857 ***** 22736 1727204243.07235: entering _queue_task() for managed-node2/debug 22736 1727204243.07440: worker is 1 (out of 1 available) 22736 1727204243.07454: exiting _queue_task() for managed-node2/debug 22736 1727204243.07467: done queuing things up, now waiting for results queue to drain 22736 1727204243.07468: waiting for pending results... 22736 1727204243.07627: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 22736 1727204243.07702: in run() - task 12b410aa-8751-4f4a-548a-000000000105 22736 1727204243.07712: variable 'ansible_search_path' from source: unknown 22736 1727204243.07716: variable 'ansible_search_path' from source: unknown 22736 1727204243.07750: calling self._execute() 22736 1727204243.07818: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.07827: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.07837: variable 'omit' from source: magic vars 22736 1727204243.08155: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.08166: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.08173: variable 'omit' from source: magic vars 22736 1727204243.08209: variable 'omit' from source: magic vars 22736 1727204243.08294: variable 'current_interfaces' from source: set_fact 22736 1727204243.08316: variable 'omit' from source: magic vars 22736 1727204243.08351: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.08392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.08410: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.08427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.08439: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.08470: 
variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.08473: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.08478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.08561: Set connection var ansible_timeout to 10 22736 1727204243.08575: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.08585: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.08588: Set connection var ansible_shell_type to sh 22736 1727204243.08597: Set connection var ansible_pipelining to False 22736 1727204243.08599: Set connection var ansible_connection to ssh 22736 1727204243.08619: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.08623: variable 'ansible_connection' from source: unknown 22736 1727204243.08625: variable 'ansible_module_compression' from source: unknown 22736 1727204243.08628: variable 'ansible_shell_type' from source: unknown 22736 1727204243.08633: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.08637: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.08642: variable 'ansible_pipelining' from source: unknown 22736 1727204243.08646: variable 'ansible_timeout' from source: unknown 22736 1727204243.08651: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.08770: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204243.08780: variable 'omit' from source: magic vars 22736 1727204243.08791: starting attempt loop 22736 1727204243.08796: running the handler 22736 1727204243.08834: handler run complete 22736 1727204243.08848: attempt loop complete, returning result 22736 1727204243.08851: _execute() done 22736 1727204243.08854: dumping result to json 22736 1727204243.08859: done dumping result, returning 22736 1727204243.08866: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-4f4a-548a-000000000105] 22736 1727204243.08871: sending task result for task 12b410aa-8751-4f4a-548a-000000000105 22736 1727204243.08964: done sending task result for task 12b410aa-8751-4f4a-548a-000000000105 22736 1727204243.08968: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22736 1727204243.09045: no more pending results, returning what we have 22736 1727204243.09049: results queue empty 22736 1727204243.09050: checking for any_errors_fatal 22736 1727204243.09054: done checking for any_errors_fatal 22736 1727204243.09055: checking for max_fail_percentage 22736 1727204243.09056: done checking for max_fail_percentage 22736 1727204243.09057: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.09058: done checking to see if all hosts have failed 22736 1727204243.09059: getting the remaining hosts for this loop 22736 1727204243.09060: done getting the remaining hosts for this loop 22736 1727204243.09064: getting the next task for host managed-node2 22736 1727204243.09071: done getting next task for host managed-node2 22736 1727204243.09074: ^ task is: TASK: Include the task 'manage_test_interface.yml' 22736 1727204243.09076: ^ state is: HOST STATE: 
block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204243.09081: getting variables 22736 1727204243.09083: in VariableManager get_vars() 22736 1727204243.09106: Calling all_inventory to load vars for managed-node2 22736 1727204243.09110: Calling groups_inventory to load vars for managed-node2 22736 1727204243.09115: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.09124: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.09126: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.09128: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.09265: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.09424: done with get_vars() 22736 1727204243.09432: done getting variables TASK [Include the task 'manage_test_interface.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:26 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.022) 0:00:07.879 ***** 22736 1727204243.09502: entering _queue_task() for managed-node2/include_tasks 22736 1727204243.09699: worker is 1 (out of 1 available) 22736 1727204243.09717: exiting _queue_task() for managed-node2/include_tasks 22736 1727204243.09729: done queuing things up, now waiting for results queue to drain 22736 1727204243.09731: waiting for pending results... 22736 1727204243.09878: running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' 22736 1727204243.09941: in run() - task 12b410aa-8751-4f4a-548a-000000000011 22736 1727204243.09960: variable 'ansible_search_path' from source: unknown 22736 1727204243.09988: calling self._execute() 22736 1727204243.10054: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.10067: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.10077: variable 'omit' from source: magic vars 22736 1727204243.10370: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.10380: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.10389: _execute() done 22736 1727204243.10392: dumping result to json 22736 1727204243.10404: done dumping result, returning 22736 1727204243.10408: done running TaskExecutor() for managed-node2/TASK: Include the task 'manage_test_interface.yml' [12b410aa-8751-4f4a-548a-000000000011] 22736 1727204243.10410: sending task result for task 12b410aa-8751-4f4a-548a-000000000011 22736 1727204243.10507: done sending task result for task 12b410aa-8751-4f4a-548a-000000000011 22736 1727204243.10510: WORKER PROCESS EXITING 22736 1727204243.10541: no more pending results, returning what we have 22736 1727204243.10546: in VariableManager get_vars() 22736 1727204243.10575: Calling all_inventory to load vars for managed-node2 22736 1727204243.10578: Calling groups_inventory to load vars for managed-node2 22736 1727204243.10581: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.10600: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.10604: Calling groups_plugins_inventory to load vars for managed-node2 22736 
1727204243.10608: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.10785: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.10941: done with get_vars() 22736 1727204243.10947: variable 'ansible_search_path' from source: unknown 22736 1727204243.10957: we have included files to process 22736 1727204243.10957: generating all_blocks data 22736 1727204243.10958: done generating all_blocks data 22736 1727204243.10962: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22736 1727204243.10963: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22736 1727204243.10964: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml 22736 1727204243.11383: in VariableManager get_vars() 22736 1727204243.11398: done with get_vars() 22736 1727204243.11572: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 22736 1727204243.12063: done processing included file 22736 1727204243.12065: iterating over new_blocks loaded from include file 22736 1727204243.12066: in VariableManager get_vars() 22736 1727204243.12075: done with get_vars() 22736 1727204243.12076: filtering new block on tags 22736 1727204243.12102: done filtering new block on tags 22736 1727204243.12104: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml for managed-node2 22736 1727204243.12108: extending task lists for all hosts with included blocks 22736 1727204243.12252: done extending task lists 22736 1727204243.12253: done processing included files 22736 1727204243.12253: results queue empty 22736 1727204243.12254: checking for any_errors_fatal 22736 1727204243.12256: done checking for any_errors_fatal 22736 1727204243.12256: checking for max_fail_percentage 22736 1727204243.12257: done checking for max_fail_percentage 22736 1727204243.12258: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.12258: done checking to see if all hosts have failed 22736 1727204243.12259: getting the remaining hosts for this loop 22736 1727204243.12260: done getting the remaining hosts for this loop 22736 1727204243.12262: getting the next task for host managed-node2 22736 1727204243.12264: done getting next task for host managed-node2 22736 1727204243.12266: ^ task is: TASK: Ensure state in ["present", "absent"] 22736 1727204243.12268: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.12269: getting variables 22736 1727204243.12270: in VariableManager get_vars() 22736 1727204243.12276: Calling all_inventory to load vars for managed-node2 22736 1727204243.12278: Calling groups_inventory to load vars for managed-node2 22736 1727204243.12280: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.12283: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.12285: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.12287: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.12400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.12550: done with get_vars() 22736 1727204243.12558: done getting variables 22736 1727204243.12610: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Ensure state in ["present", "absent"]] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:3 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.031) 0:00:07.911 ***** 22736 1727204243.12633: entering _queue_task() for managed-node2/fail 22736 1727204243.12634: Creating lock for fail 22736 1727204243.12845: worker is 1 (out of 1 available) 22736 1727204243.12859: exiting _queue_task() for managed-node2/fail 22736 1727204243.12871: done queuing things up, now waiting for results queue to drain 22736 1727204243.12872: waiting for pending results... 
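manage_test_interface.yml opens with input validation: a fail task guarded by a when condition, which is skipped in the next record because state is valid. A hedged sketch of that guard, inferred from the task name at manage_test_interface.yml:3 and the false_condition reported in the skip result (the fail message itself is an assumption, not taken from the log):

- name: 'Ensure state in ["present", "absent"]'
  fail:
    msg: Unsupported value for state   # assumed message; the log only shows the condition
  when: state not in ["present", "absent"]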
22736 1727204243.13023: running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] 22736 1727204243.13096: in run() - task 12b410aa-8751-4f4a-548a-000000000131 22736 1727204243.13111: variable 'ansible_search_path' from source: unknown 22736 1727204243.13117: variable 'ansible_search_path' from source: unknown 22736 1727204243.13140: calling self._execute() 22736 1727204243.13206: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.13214: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.13226: variable 'omit' from source: magic vars 22736 1727204243.13555: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.13567: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.13685: variable 'state' from source: include params 22736 1727204243.13692: Evaluated conditional (state not in ["present", "absent"]): False 22736 1727204243.13696: when evaluation is False, skipping this task 22736 1727204243.13700: _execute() done 22736 1727204243.13705: dumping result to json 22736 1727204243.13709: done dumping result, returning 22736 1727204243.13718: done running TaskExecutor() for managed-node2/TASK: Ensure state in ["present", "absent"] [12b410aa-8751-4f4a-548a-000000000131] 22736 1727204243.13722: sending task result for task 12b410aa-8751-4f4a-548a-000000000131 22736 1727204243.13810: done sending task result for task 12b410aa-8751-4f4a-548a-000000000131 22736 1727204243.13816: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "state not in [\"present\", \"absent\"]", "skip_reason": "Conditional result was False" } 22736 1727204243.13868: no more pending results, returning what we have 22736 1727204243.13872: results queue empty 22736 1727204243.13873: checking for any_errors_fatal 22736 1727204243.13875: done checking for any_errors_fatal 22736 1727204243.13876: checking for max_fail_percentage 22736 1727204243.13877: done checking for max_fail_percentage 22736 1727204243.13878: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.13879: done checking to see if all hosts have failed 22736 1727204243.13880: getting the remaining hosts for this loop 22736 1727204243.13882: done getting the remaining hosts for this loop 22736 1727204243.13885: getting the next task for host managed-node2 22736 1727204243.13894: done getting next task for host managed-node2 22736 1727204243.13897: ^ task is: TASK: Ensure type in ["dummy", "tap", "veth"] 22736 1727204243.13900: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.13903: getting variables 22736 1727204243.13905: in VariableManager get_vars() 22736 1727204243.13933: Calling all_inventory to load vars for managed-node2 22736 1727204243.13936: Calling groups_inventory to load vars for managed-node2 22736 1727204243.13940: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.13951: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.13954: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.13962: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.14128: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.14283: done with get_vars() 22736 1727204243.14294: done getting variables 22736 1727204243.14340: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Ensure type in ["dummy", "tap", "veth"]] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:8 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.017) 0:00:07.928 ***** 22736 1727204243.14362: entering _queue_task() for managed-node2/fail 22736 1727204243.14569: worker is 1 (out of 1 available) 22736 1727204243.14582: exiting _queue_task() for managed-node2/fail 22736 1727204243.14596: done queuing things up, now waiting for results queue to drain 22736 1727204243.14598: waiting for pending results... 22736 1727204243.14747: running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] 22736 1727204243.14819: in run() - task 12b410aa-8751-4f4a-548a-000000000132 22736 1727204243.14831: variable 'ansible_search_path' from source: unknown 22736 1727204243.14838: variable 'ansible_search_path' from source: unknown 22736 1727204243.14869: calling self._execute() 22736 1727204243.14948: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.14951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.14961: variable 'omit' from source: magic vars 22736 1727204243.15275: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.15287: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.15411: variable 'type' from source: set_fact 22736 1727204243.15419: Evaluated conditional (type not in ["dummy", "tap", "veth"]): False 22736 1727204243.15422: when evaluation is False, skipping this task 22736 1727204243.15428: _execute() done 22736 1727204243.15431: dumping result to json 22736 1727204243.15436: done dumping result, returning 22736 1727204243.15442: done running TaskExecutor() for managed-node2/TASK: Ensure type in ["dummy", "tap", "veth"] [12b410aa-8751-4f4a-548a-000000000132] 22736 1727204243.15448: sending task result for task 12b410aa-8751-4f4a-548a-000000000132 22736 1727204243.15536: done sending task result for task 12b410aa-8751-4f4a-548a-000000000132 22736 1727204243.15540: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type not in [\"dummy\", \"tap\", \"veth\"]", "skip_reason": "Conditional result was False" } 22736 1727204243.15593: no more pending 
results, returning what we have 22736 1727204243.15597: results queue empty 22736 1727204243.15598: checking for any_errors_fatal 22736 1727204243.15603: done checking for any_errors_fatal 22736 1727204243.15604: checking for max_fail_percentage 22736 1727204243.15606: done checking for max_fail_percentage 22736 1727204243.15607: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.15608: done checking to see if all hosts have failed 22736 1727204243.15609: getting the remaining hosts for this loop 22736 1727204243.15610: done getting the remaining hosts for this loop 22736 1727204243.15614: getting the next task for host managed-node2 22736 1727204243.15621: done getting next task for host managed-node2 22736 1727204243.15623: ^ task is: TASK: Include the task 'show_interfaces.yml' 22736 1727204243.15627: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204243.15630: getting variables 22736 1727204243.15631: in VariableManager get_vars() 22736 1727204243.15655: Calling all_inventory to load vars for managed-node2 22736 1727204243.15658: Calling groups_inventory to load vars for managed-node2 22736 1727204243.15661: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.15671: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.15674: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.15677: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.15815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.15987: done with get_vars() 22736 1727204243.15996: done getting variables TASK [Include the task 'show_interfaces.yml'] ********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:13 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.017) 0:00:07.945 ***** 22736 1727204243.16067: entering _queue_task() for managed-node2/include_tasks 22736 1727204243.16255: worker is 1 (out of 1 available) 22736 1727204243.16269: exiting _queue_task() for managed-node2/include_tasks 22736 1727204243.16282: done queuing things up, now waiting for results queue to drain 22736 1727204243.16283: waiting for pending results... 
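The second guard and the include that follows it can be sketched the same way: the type check is inferred from the false_condition in the skip result above, and the include from the task path manage_test_interface.yml:13 together with the show_interfaces.yml file the run goes on to load. The fail message and the exact relative include path are assumptions:

- name: 'Ensure type in ["dummy", "tap", "veth"]'
  fail:
    msg: Unsupported interface type   # assumed message
  when: type not in ["dummy", "tap", "veth"]

- name: Include the task 'show_interfaces.yml'
  include_tasks: show_interfaces.yml   # sibling file under tasks/ in this collection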
22736 1727204243.16435: running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' 22736 1727204243.16510: in run() - task 12b410aa-8751-4f4a-548a-000000000133 22736 1727204243.16526: variable 'ansible_search_path' from source: unknown 22736 1727204243.16531: variable 'ansible_search_path' from source: unknown 22736 1727204243.16560: calling self._execute() 22736 1727204243.16631: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.16635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.16644: variable 'omit' from source: magic vars 22736 1727204243.16944: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.16956: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.16965: _execute() done 22736 1727204243.16969: dumping result to json 22736 1727204243.16975: done dumping result, returning 22736 1727204243.16980: done running TaskExecutor() for managed-node2/TASK: Include the task 'show_interfaces.yml' [12b410aa-8751-4f4a-548a-000000000133] 22736 1727204243.16987: sending task result for task 12b410aa-8751-4f4a-548a-000000000133 22736 1727204243.17079: done sending task result for task 12b410aa-8751-4f4a-548a-000000000133 22736 1727204243.17082: WORKER PROCESS EXITING 22736 1727204243.17117: no more pending results, returning what we have 22736 1727204243.17121: in VariableManager get_vars() 22736 1727204243.17152: Calling all_inventory to load vars for managed-node2 22736 1727204243.17155: Calling groups_inventory to load vars for managed-node2 22736 1727204243.17158: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.17169: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.17171: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.17175: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.17324: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.17472: done with get_vars() 22736 1727204243.17479: variable 'ansible_search_path' from source: unknown 22736 1727204243.17479: variable 'ansible_search_path' from source: unknown 22736 1727204243.17508: we have included files to process 22736 1727204243.17509: generating all_blocks data 22736 1727204243.17511: done generating all_blocks data 22736 1727204243.17517: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204243.17518: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204243.17520: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml 22736 1727204243.17604: in VariableManager get_vars() 22736 1727204243.17620: done with get_vars() 22736 1727204243.17707: done processing included file 22736 1727204243.17709: iterating over new_blocks loaded from include file 22736 1727204243.17711: in VariableManager get_vars() 22736 1727204243.17723: done with get_vars() 22736 1727204243.17724: filtering new block on tags 22736 1727204243.17738: done filtering new block on tags 22736 1727204243.17739: done iterating over new_blocks loaded from include file included: 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml for managed-node2 22736 1727204243.17743: extending task lists for all hosts with included blocks 22736 1727204243.18080: done extending task lists 22736 1727204243.18081: done processing included files 22736 1727204243.18082: results queue empty 22736 1727204243.18082: checking for any_errors_fatal 22736 1727204243.18084: done checking for any_errors_fatal 22736 1727204243.18085: checking for max_fail_percentage 22736 1727204243.18086: done checking for max_fail_percentage 22736 1727204243.18086: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.18087: done checking to see if all hosts have failed 22736 1727204243.18088: getting the remaining hosts for this loop 22736 1727204243.18088: done getting the remaining hosts for this loop 22736 1727204243.18092: getting the next task for host managed-node2 22736 1727204243.18095: done getting next task for host managed-node2 22736 1727204243.18097: ^ task is: TASK: Include the task 'get_current_interfaces.yml' 22736 1727204243.18099: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204243.18101: getting variables 22736 1727204243.18102: in VariableManager get_vars() 22736 1727204243.18108: Calling all_inventory to load vars for managed-node2 22736 1727204243.18109: Calling groups_inventory to load vars for managed-node2 22736 1727204243.18135: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.18140: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.18142: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.18145: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.18257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.18410: done with get_vars() 22736 1727204243.18419: done getting variables TASK [Include the task 'get_current_interfaces.yml'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:3 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.024) 0:00:07.969 ***** 22736 1727204243.18474: entering _queue_task() for managed-node2/include_tasks 22736 1727204243.18670: worker is 1 (out of 1 available) 22736 1727204243.18684: exiting _queue_task() for managed-node2/include_tasks 22736 1727204243.18697: done queuing things up, now waiting for results queue to drain 22736 1727204243.18699: waiting for pending results... 
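show_interfaces.yml itself appears to be short: it pulls in get_current_interfaces.yml and then prints the resulting fact, matching the earlier debug output "current_interfaces: ['bonding_masters', 'eth0', 'lo']" recorded at show_interfaces.yml:5. A minimal sketch along those lines, with the relative include path an assumption:

- name: Include the task 'get_current_interfaces.yml'
  include_tasks: get_current_interfaces.yml   # show_interfaces.yml:3 in this log

- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"   # show_interfaces.yml:5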
22736 1727204243.18859: running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' 22736 1727204243.18936: in run() - task 12b410aa-8751-4f4a-548a-00000000015c 22736 1727204243.18948: variable 'ansible_search_path' from source: unknown 22736 1727204243.18952: variable 'ansible_search_path' from source: unknown 22736 1727204243.18981: calling self._execute() 22736 1727204243.19049: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.19056: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.19066: variable 'omit' from source: magic vars 22736 1727204243.19367: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.19376: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.19382: _execute() done 22736 1727204243.19387: dumping result to json 22736 1727204243.19393: done dumping result, returning 22736 1727204243.19400: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_current_interfaces.yml' [12b410aa-8751-4f4a-548a-00000000015c] 22736 1727204243.19405: sending task result for task 12b410aa-8751-4f4a-548a-00000000015c 22736 1727204243.19494: done sending task result for task 12b410aa-8751-4f4a-548a-00000000015c 22736 1727204243.19498: WORKER PROCESS EXITING 22736 1727204243.19535: no more pending results, returning what we have 22736 1727204243.19540: in VariableManager get_vars() 22736 1727204243.19570: Calling all_inventory to load vars for managed-node2 22736 1727204243.19573: Calling groups_inventory to load vars for managed-node2 22736 1727204243.19576: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.19587: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.19592: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.19596: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.19747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.19918: done with get_vars() 22736 1727204243.19924: variable 'ansible_search_path' from source: unknown 22736 1727204243.19925: variable 'ansible_search_path' from source: unknown 22736 1727204243.19973: we have included files to process 22736 1727204243.19974: generating all_blocks data 22736 1727204243.19975: done generating all_blocks data 22736 1727204243.19976: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204243.19977: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204243.19978: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml 22736 1727204243.20185: done processing included file 22736 1727204243.20186: iterating over new_blocks loaded from include file 22736 1727204243.20188: in VariableManager get_vars() 22736 1727204243.20200: done with get_vars() 22736 1727204243.20201: filtering new block on tags 22736 1727204243.20217: done filtering new block on tags 22736 1727204243.20219: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml for 
managed-node2 22736 1727204243.20222: extending task lists for all hosts with included blocks 22736 1727204243.20347: done extending task lists 22736 1727204243.20348: done processing included files 22736 1727204243.20349: results queue empty 22736 1727204243.20349: checking for any_errors_fatal 22736 1727204243.20351: done checking for any_errors_fatal 22736 1727204243.20352: checking for max_fail_percentage 22736 1727204243.20353: done checking for max_fail_percentage 22736 1727204243.20353: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.20354: done checking to see if all hosts have failed 22736 1727204243.20354: getting the remaining hosts for this loop 22736 1727204243.20355: done getting the remaining hosts for this loop 22736 1727204243.20357: getting the next task for host managed-node2 22736 1727204243.20361: done getting next task for host managed-node2 22736 1727204243.20362: ^ task is: TASK: Gather current interface info 22736 1727204243.20365: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.20367: getting variables 22736 1727204243.20367: in VariableManager get_vars() 22736 1727204243.20374: Calling all_inventory to load vars for managed-node2 22736 1727204243.20377: Calling groups_inventory to load vars for managed-node2 22736 1727204243.20379: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.20383: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.20385: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.20387: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.20500: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.20650: done with get_vars() 22736 1727204243.20657: done getting variables 22736 1727204243.20687: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gather current interface info] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:3 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.022) 0:00:07.992 ***** 22736 1727204243.20717: entering _queue_task() for managed-node2/command 22736 1727204243.20917: worker is 1 (out of 1 available) 22736 1727204243.20932: exiting _queue_task() for managed-node2/command 22736 1727204243.20945: done queuing things up, now waiting for results queue to drain 22736 1727204243.20947: waiting for pending results... 
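The records that follow show the ssh connection plugin's per-task execution sequence for this command module: discover the remote home directory, create a timestamped temp directory, upload AnsiballZ_command.py over sftp, mark it executable, run it, and remove the temp directory. Paraphrased as the /bin/sh commands visible in this log, with the timestamped directory shown as a <tmpdir> placeholder:

/bin/sh -c 'echo ~ && sleep 0'
/bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `" && mkdir "<tmpdir>" && echo ansible-tmp=<tmpdir> ) && sleep 0'
# sftp: put AnsiballZ_command.py into <tmpdir>
/bin/sh -c 'chmod u+x <tmpdir>/ <tmpdir>/AnsiballZ_command.py && sleep 0'
# after the module runs, the temp directory is cleaned up, as seen earlier:
/bin/sh -c 'rm -f -r <tmpdir>/ > /dev/null 2>&1 && sleep 0'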
22736 1727204243.21094: running TaskExecutor() for managed-node2/TASK: Gather current interface info 22736 1727204243.21179: in run() - task 12b410aa-8751-4f4a-548a-000000000193 22736 1727204243.21193: variable 'ansible_search_path' from source: unknown 22736 1727204243.21197: variable 'ansible_search_path' from source: unknown 22736 1727204243.21230: calling self._execute() 22736 1727204243.21301: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.21305: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.21318: variable 'omit' from source: magic vars 22736 1727204243.21668: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.21679: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.21685: variable 'omit' from source: magic vars 22736 1727204243.21738: variable 'omit' from source: magic vars 22736 1727204243.21768: variable 'omit' from source: magic vars 22736 1727204243.21804: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.21840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.21858: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.21876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.21886: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.21918: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.21921: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.21924: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.22010: Set connection var ansible_timeout to 10 22736 1727204243.22023: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.22031: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.22036: Set connection var ansible_shell_type to sh 22736 1727204243.22041: Set connection var ansible_pipelining to False 22736 1727204243.22044: Set connection var ansible_connection to ssh 22736 1727204243.22067: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.22071: variable 'ansible_connection' from source: unknown 22736 1727204243.22074: variable 'ansible_module_compression' from source: unknown 22736 1727204243.22076: variable 'ansible_shell_type' from source: unknown 22736 1727204243.22080: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.22084: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.22091: variable 'ansible_pipelining' from source: unknown 22736 1727204243.22094: variable 'ansible_timeout' from source: unknown 22736 1727204243.22100: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.22224: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204243.22233: variable 'omit' from source: magic vars 22736 
1727204243.22239: starting attempt loop 22736 1727204243.22242: running the handler 22736 1727204243.22259: _low_level_execute_command(): starting 22736 1727204243.22273: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204243.22824: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204243.22828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.22832: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204243.22835: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.22888: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.22897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204243.22899: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.22943: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.24721: stdout chunk (state=3): >>>/root <<< 22736 1727204243.24828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.24890: stderr chunk (state=3): >>><<< 22736 1727204243.24893: stdout chunk (state=3): >>><<< 22736 1727204243.24924: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.24936: _low_level_execute_command(): starting 22736 1727204243.24945: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885 `" && echo 
ansible-tmp-1727204243.2492383-23295-178757096285885="` echo /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885 `" ) && sleep 0' 22736 1727204243.25446: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204243.25449: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204243.25452: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.25463: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204243.25465: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.25513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.25516: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.25568: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.27618: stdout chunk (state=3): >>>ansible-tmp-1727204243.2492383-23295-178757096285885=/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885 <<< 22736 1727204243.27734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.27798: stderr chunk (state=3): >>><<< 22736 1727204243.27802: stdout chunk (state=3): >>><<< 22736 1727204243.27823: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.2492383-23295-178757096285885=/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.27858: variable 'ansible_module_compression' from source: unknown 22736 1727204243.27908: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204243.27942: variable 'ansible_facts' from source: unknown 22736 1727204243.28012: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py 22736 1727204243.28137: Sending initial data 22736 1727204243.28141: Sent initial data (156 bytes) 22736 1727204243.28643: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204243.28646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204243.28649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204243.28651: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204243.28653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.28711: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.28717: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.28755: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.30462: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204243.30503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204243.30551: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpf9fifjaq /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py <<< 22736 1727204243.30554: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py" <<< 22736 1727204243.30601: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpf9fifjaq" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py" <<< 22736 1727204243.31683: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.31793: stderr chunk (state=3): >>><<< 22736 1727204243.31884: stdout chunk (state=3): >>><<< 22736 1727204243.31887: done transferring module to remote 22736 1727204243.31891: _low_level_execute_command(): starting 22736 1727204243.31894: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/ /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py && sleep 0' 22736 1727204243.32568: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204243.32605: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204243.32677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.32802: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.32841: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204243.32847: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.32917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.34873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.34995: stderr chunk (state=3): >>><<< 22736 1727204243.34998: stdout chunk (state=3): >>><<< 22736 1727204243.35001: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.35004: _low_level_execute_command(): starting 22736 1727204243.35006: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/AnsiballZ_command.py && sleep 0' 22736 1727204243.35616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204243.35638: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204243.35653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204243.35712: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.35787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.35808: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204243.35833: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.35903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.53819: stdout chunk (state=3): >>> {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:23.533385", "end": "2024-09-24 14:57:23.537056", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204243.55549: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204243.55553: stdout chunk (state=3): >>><<< 22736 1727204243.55556: stderr chunk (state=3): >>><<< 22736 1727204243.55695: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bonding_masters\neth0\nlo", "stderr": "", "rc": 0, "cmd": ["ls", "-1"], "start": "2024-09-24 14:57:23.533385", "end": "2024-09-24 14:57:23.537056", "delta": "0:00:00.003671", "msg": "", "invocation": {"module_args": {"chdir": "/sys/class/net", "_raw_params": "ls -1", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
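The JSON echoed on stdout above is the result payload of the AnsiballZ-wrapped ansible.legacy.command module that was copied into the remote temporary directory, made executable, and run with /usr/bin/python3.12. From the module_args it carries (chdir=/sys/class/net, _raw_params=ls -1) the originating task can be reconstructed roughly as below; this is only a sketch, and the register name is an assumption inferred from the later _current_interfaces variable trace rather than something printed verbatim in the log.

# Hedged reconstruction of the "Gather current interface info" task
- name: Gather current interface info
  command: ls -1            # _raw_params from the echoed module_args
  args:
    chdir: /sys/class/net   # chdir from the echoed module_args
  register: _current_interfaces   # assumed name, inferred from the variable trace further down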
22736 1727204243.55699: done with _execute_module (ansible.legacy.command, {'chdir': '/sys/class/net', '_raw_params': 'ls -1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204243.55702: _low_level_execute_command(): starting 22736 1727204243.55704: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.2492383-23295-178757096285885/ > /dev/null 2>&1 && sleep 0' 22736 1727204243.56341: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204243.56375: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204243.56393: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204243.56492: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.56526: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204243.56544: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204243.56565: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.56639: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.58604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.58696: stderr chunk (state=3): >>><<< 22736 1727204243.58708: stdout chunk (state=3): >>><<< 22736 1727204243.58740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.58895: handler run complete 22736 1727204243.58898: Evaluated conditional (False): False 22736 1727204243.58901: attempt loop complete, returning result 22736 1727204243.58903: _execute() done 22736 1727204243.58905: dumping result to json 22736 1727204243.58907: done dumping result, returning 22736 1727204243.58910: done running TaskExecutor() for managed-node2/TASK: Gather current interface info [12b410aa-8751-4f4a-548a-000000000193] 22736 1727204243.58912: sending task result for task 12b410aa-8751-4f4a-548a-000000000193 ok: [managed-node2] => { "changed": false, "cmd": [ "ls", "-1" ], "delta": "0:00:00.003671", "end": "2024-09-24 14:57:23.537056", "rc": 0, "start": "2024-09-24 14:57:23.533385" } STDOUT: bonding_masters eth0 lo 22736 1727204243.59162: no more pending results, returning what we have 22736 1727204243.59167: results queue empty 22736 1727204243.59169: checking for any_errors_fatal 22736 1727204243.59171: done checking for any_errors_fatal 22736 1727204243.59172: checking for max_fail_percentage 22736 1727204243.59174: done checking for max_fail_percentage 22736 1727204243.59175: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.59176: done checking to see if all hosts have failed 22736 1727204243.59177: getting the remaining hosts for this loop 22736 1727204243.59179: done getting the remaining hosts for this loop 22736 1727204243.59184: getting the next task for host managed-node2 22736 1727204243.59245: done getting next task for host managed-node2 22736 1727204243.59249: ^ task is: TASK: Set current_interfaces 22736 1727204243.59255: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.59259: getting variables 22736 1727204243.59261: in VariableManager get_vars() 22736 1727204243.59517: Calling all_inventory to load vars for managed-node2 22736 1727204243.59521: Calling groups_inventory to load vars for managed-node2 22736 1727204243.59526: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.59542: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.59663: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.59670: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.59919: done sending task result for task 12b410aa-8751-4f4a-548a-000000000193 22736 1727204243.59923: WORKER PROCESS EXITING 22736 1727204243.59952: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.60280: done with get_vars() 22736 1727204243.60295: done getting variables 22736 1727204243.60373: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set current_interfaces] ************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_current_interfaces.yml:9 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.396) 0:00:08.389 ***** 22736 1727204243.60415: entering _queue_task() for managed-node2/set_fact 22736 1727204243.60748: worker is 1 (out of 1 available) 22736 1727204243.60765: exiting _queue_task() for managed-node2/set_fact 22736 1727204243.60904: done queuing things up, now waiting for results queue to drain 22736 1727204243.60906: waiting for pending results... 
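The task just queued, Set current_interfaces (get_current_interfaces.yml:9), promotes the registered command output to a fact; its result further down shows current_interfaces becoming ["bonding_masters", "eth0", "lo"]. A minimal sketch, assuming the fact is built from the registered stdout_lines (the exact expression is not shown in the log):

# Sketch only; the Jinja2 expression is an assumption chosen to match the resulting fact
- name: Set current_interfaces
  set_fact:
    current_interfaces: "{{ _current_interfaces.stdout_lines }}"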
22736 1727204243.61115: running TaskExecutor() for managed-node2/TASK: Set current_interfaces 22736 1727204243.61283: in run() - task 12b410aa-8751-4f4a-548a-000000000194 22736 1727204243.61317: variable 'ansible_search_path' from source: unknown 22736 1727204243.61326: variable 'ansible_search_path' from source: unknown 22736 1727204243.61454: calling self._execute() 22736 1727204243.61800: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.61803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.61806: variable 'omit' from source: magic vars 22736 1727204243.62669: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.62763: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.62776: variable 'omit' from source: magic vars 22736 1727204243.62994: variable 'omit' from source: magic vars 22736 1727204243.63245: variable '_current_interfaces' from source: set_fact 22736 1727204243.63446: variable 'omit' from source: magic vars 22736 1727204243.63502: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.63664: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.63774: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.63777: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.63883: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.63904: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.63956: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.63968: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.64410: Set connection var ansible_timeout to 10 22736 1727204243.64414: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.64417: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.64420: Set connection var ansible_shell_type to sh 22736 1727204243.64422: Set connection var ansible_pipelining to False 22736 1727204243.64425: Set connection var ansible_connection to ssh 22736 1727204243.64427: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.64430: variable 'ansible_connection' from source: unknown 22736 1727204243.64432: variable 'ansible_module_compression' from source: unknown 22736 1727204243.64434: variable 'ansible_shell_type' from source: unknown 22736 1727204243.64436: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.64438: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.64440: variable 'ansible_pipelining' from source: unknown 22736 1727204243.64595: variable 'ansible_timeout' from source: unknown 22736 1727204243.64598: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.64861: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22736 1727204243.65067: variable 'omit' from source: magic vars 22736 1727204243.65071: starting attempt loop 22736 1727204243.65074: running the handler 22736 1727204243.65076: handler run complete 22736 1727204243.65079: attempt loop complete, returning result 22736 1727204243.65081: _execute() done 22736 1727204243.65083: dumping result to json 22736 1727204243.65086: done dumping result, returning 22736 1727204243.65091: done running TaskExecutor() for managed-node2/TASK: Set current_interfaces [12b410aa-8751-4f4a-548a-000000000194] 22736 1727204243.65094: sending task result for task 12b410aa-8751-4f4a-548a-000000000194 ok: [managed-node2] => { "ansible_facts": { "current_interfaces": [ "bonding_masters", "eth0", "lo" ] }, "changed": false } 22736 1727204243.65360: no more pending results, returning what we have 22736 1727204243.65364: results queue empty 22736 1727204243.65365: checking for any_errors_fatal 22736 1727204243.65374: done checking for any_errors_fatal 22736 1727204243.65375: checking for max_fail_percentage 22736 1727204243.65377: done checking for max_fail_percentage 22736 1727204243.65378: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.65380: done checking to see if all hosts have failed 22736 1727204243.65381: getting the remaining hosts for this loop 22736 1727204243.65383: done getting the remaining hosts for this loop 22736 1727204243.65394: getting the next task for host managed-node2 22736 1727204243.65408: done getting next task for host managed-node2 22736 1727204243.65411: ^ task is: TASK: Show current_interfaces 22736 1727204243.65416: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.65422: getting variables 22736 1727204243.65424: in VariableManager get_vars() 22736 1727204243.65459: Calling all_inventory to load vars for managed-node2 22736 1727204243.65463: Calling groups_inventory to load vars for managed-node2 22736 1727204243.65467: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.65482: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.65486: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.65618: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.65695: done sending task result for task 12b410aa-8751-4f4a-548a-000000000194 22736 1727204243.65699: WORKER PROCESS EXITING 22736 1727204243.66040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.66363: done with get_vars() 22736 1727204243.66381: done getting variables 22736 1727204243.66446: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show current_interfaces] ************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/show_interfaces.yml:5 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.060) 0:00:08.449 ***** 22736 1727204243.66488: entering _queue_task() for managed-node2/debug 22736 1727204243.66762: worker is 1 (out of 1 available) 22736 1727204243.66775: exiting _queue_task() for managed-node2/debug 22736 1727204243.66788: done queuing things up, now waiting for results queue to drain 22736 1727204243.66921: waiting for pending results... 
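Next in the queue is Show current_interfaces (show_interfaces.yml:5), a debug task whose output below prints the fact as "current_interfaces: ['bonding_masters', 'eth0', 'lo']". A sketch consistent with that output; the msg template itself is an assumption:

# Sketch; msg template assumed so that it renders the MSG line printed below
- name: Show current_interfaces
  debug:
    msg: "current_interfaces: {{ current_interfaces }}"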
22736 1727204243.67064: running TaskExecutor() for managed-node2/TASK: Show current_interfaces 22736 1727204243.67201: in run() - task 12b410aa-8751-4f4a-548a-00000000015d 22736 1727204243.67222: variable 'ansible_search_path' from source: unknown 22736 1727204243.67230: variable 'ansible_search_path' from source: unknown 22736 1727204243.67278: calling self._execute() 22736 1727204243.67376: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.67390: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.67406: variable 'omit' from source: magic vars 22736 1727204243.67848: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.67867: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.67878: variable 'omit' from source: magic vars 22736 1727204243.67953: variable 'omit' from source: magic vars 22736 1727204243.68086: variable 'current_interfaces' from source: set_fact 22736 1727204243.68131: variable 'omit' from source: magic vars 22736 1727204243.68177: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.68224: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.68259: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.68284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.68303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.68391: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.68395: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.68398: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.68498: Set connection var ansible_timeout to 10 22736 1727204243.68523: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.68538: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.68546: Set connection var ansible_shell_type to sh 22736 1727204243.68593: Set connection var ansible_pipelining to False 22736 1727204243.68596: Set connection var ansible_connection to ssh 22736 1727204243.68599: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.68606: variable 'ansible_connection' from source: unknown 22736 1727204243.68614: variable 'ansible_module_compression' from source: unknown 22736 1727204243.68627: variable 'ansible_shell_type' from source: unknown 22736 1727204243.68634: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.68642: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.68652: variable 'ansible_pipelining' from source: unknown 22736 1727204243.68777: variable 'ansible_timeout' from source: unknown 22736 1727204243.68780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.68846: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 
22736 1727204243.68864: variable 'omit' from source: magic vars 22736 1727204243.68874: starting attempt loop 22736 1727204243.68888: running the handler 22736 1727204243.68949: handler run complete 22736 1727204243.68973: attempt loop complete, returning result 22736 1727204243.68980: _execute() done 22736 1727204243.68993: dumping result to json 22736 1727204243.69008: done dumping result, returning 22736 1727204243.69020: done running TaskExecutor() for managed-node2/TASK: Show current_interfaces [12b410aa-8751-4f4a-548a-00000000015d] 22736 1727204243.69029: sending task result for task 12b410aa-8751-4f4a-548a-00000000015d 22736 1727204243.69245: done sending task result for task 12b410aa-8751-4f4a-548a-00000000015d 22736 1727204243.69248: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: current_interfaces: ['bonding_masters', 'eth0', 'lo'] 22736 1727204243.69304: no more pending results, returning what we have 22736 1727204243.69308: results queue empty 22736 1727204243.69309: checking for any_errors_fatal 22736 1727204243.69316: done checking for any_errors_fatal 22736 1727204243.69317: checking for max_fail_percentage 22736 1727204243.69323: done checking for max_fail_percentage 22736 1727204243.69325: checking to see if all hosts have failed and the running result is not ok 22736 1727204243.69326: done checking to see if all hosts have failed 22736 1727204243.69327: getting the remaining hosts for this loop 22736 1727204243.69329: done getting the remaining hosts for this loop 22736 1727204243.69334: getting the next task for host managed-node2 22736 1727204243.69343: done getting next task for host managed-node2 22736 1727204243.69347: ^ task is: TASK: Install iproute 22736 1727204243.69351: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204243.69357: getting variables 22736 1727204243.69359: in VariableManager get_vars() 22736 1727204243.69392: Calling all_inventory to load vars for managed-node2 22736 1727204243.69396: Calling groups_inventory to load vars for managed-node2 22736 1727204243.69401: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204243.69414: Calling all_plugins_play to load vars for managed-node2 22736 1727204243.69418: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204243.69423: Calling groups_plugins_play to load vars for managed-node2 22736 1727204243.69778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204243.70115: done with get_vars() 22736 1727204243.70128: done getting variables 22736 1727204243.70199: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Install iproute] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 Tuesday 24 September 2024 14:57:23 -0400 (0:00:00.037) 0:00:08.487 ***** 22736 1727204243.70237: entering _queue_task() for managed-node2/package 22736 1727204243.70522: worker is 1 (out of 1 available) 22736 1727204243.70543: exiting _queue_task() for managed-node2/package 22736 1727204243.70557: done queuing things up, now waiting for results queue to drain 22736 1727204243.70559: waiting for pending results... 
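The Install iproute task (manage_test_interface.yml:16) is handled by the generic package action plugin, which on this host resolves to the dnf backend; the dnf module_args echoed with its result further below show name=["iproute"] and state=present. A hedged sketch of the task:

# Sketch based on the task name and the dnf module_args echoed later in the log
- name: Install iproute
  package:
    name: iproute
    state: present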
22736 1727204243.70832: running TaskExecutor() for managed-node2/TASK: Install iproute 22736 1727204243.70887: in run() - task 12b410aa-8751-4f4a-548a-000000000134 22736 1727204243.70919: variable 'ansible_search_path' from source: unknown 22736 1727204243.70929: variable 'ansible_search_path' from source: unknown 22736 1727204243.70974: calling self._execute() 22736 1727204243.71075: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.71091: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.71109: variable 'omit' from source: magic vars 22736 1727204243.71569: variable 'ansible_distribution_major_version' from source: facts 22736 1727204243.71678: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204243.71682: variable 'omit' from source: magic vars 22736 1727204243.71685: variable 'omit' from source: magic vars 22736 1727204243.71930: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204243.74872: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204243.74966: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204243.75018: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204243.75062: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204243.75105: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204243.75222: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204243.75263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204243.75306: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204243.75363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204243.75386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204243.75695: variable '__network_is_ostree' from source: set_fact 22736 1727204243.75698: variable 'omit' from source: magic vars 22736 1727204243.75701: variable 'omit' from source: magic vars 22736 1727204243.75704: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204243.75706: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204243.75709: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204243.75712: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 22736 1727204243.75725: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204243.75774: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204243.75783: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.75795: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.75957: Set connection var ansible_timeout to 10 22736 1727204243.75986: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204243.76005: Set connection var ansible_shell_executable to /bin/sh 22736 1727204243.76014: Set connection var ansible_shell_type to sh 22736 1727204243.76027: Set connection var ansible_pipelining to False 22736 1727204243.76034: Set connection var ansible_connection to ssh 22736 1727204243.76072: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.76088: variable 'ansible_connection' from source: unknown 22736 1727204243.76100: variable 'ansible_module_compression' from source: unknown 22736 1727204243.76109: variable 'ansible_shell_type' from source: unknown 22736 1727204243.76126: variable 'ansible_shell_executable' from source: unknown 22736 1727204243.76134: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204243.76156: variable 'ansible_pipelining' from source: unknown 22736 1727204243.76165: variable 'ansible_timeout' from source: unknown 22736 1727204243.76178: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204243.76349: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204243.76370: variable 'omit' from source: magic vars 22736 1727204243.76389: starting attempt loop 22736 1727204243.76392: running the handler 22736 1727204243.76481: variable 'ansible_facts' from source: unknown 22736 1727204243.76485: variable 'ansible_facts' from source: unknown 22736 1727204243.76488: _low_level_execute_command(): starting 22736 1727204243.76493: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204243.77343: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204243.77408: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.77492: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing 
master <<< 22736 1727204243.77512: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204243.77542: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.77618: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.79401: stdout chunk (state=3): >>>/root <<< 22736 1727204243.79611: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.79615: stdout chunk (state=3): >>><<< 22736 1727204243.79617: stderr chunk (state=3): >>><<< 22736 1727204243.79638: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.79757: _low_level_execute_command(): starting 22736 1727204243.79762: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516 `" && echo ansible-tmp-1727204243.7965238-23317-178046191004516="` echo /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516 `" ) && sleep 0' 22736 1727204243.80345: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.80349: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204243.80352: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204243.80441: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204243.80479: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204243.82603: stdout chunk (state=3): >>>ansible-tmp-1727204243.7965238-23317-178046191004516=/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516 <<< 22736 1727204243.82826: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204243.82830: stdout chunk (state=3): >>><<< 22736 1727204243.82833: stderr chunk (state=3): >>><<< 22736 1727204243.82976: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204243.7965238-23317-178046191004516=/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204243.82979: variable 'ansible_module_compression' from source: unknown 22736 1727204243.82983: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 22736 1727204243.83007: ANSIBALLZ: Acquiring lock 22736 1727204243.83020: ANSIBALLZ: Lock acquired: 140553536881728 22736 1727204243.83030: ANSIBALLZ: Creating module 22736 1727204244.05622: ANSIBALLZ: Writing module into payload 22736 1727204244.05888: ANSIBALLZ: Writing module 22736 1727204244.05918: ANSIBALLZ: Renaming module 22736 1727204244.05921: ANSIBALLZ: Done creating module 22736 1727204244.05989: variable 'ansible_facts' from source: unknown 22736 1727204244.06063: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py 22736 1727204244.06422: Sending initial data 22736 1727204244.06426: Sent initial data (152 bytes) 22736 1727204244.06919: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204244.06924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204244.06942: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204244.06968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204244.06972: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204244.06975: 
stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204244.06977: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204244.07063: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204244.07066: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204244.07158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204244.08955: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204244.08995: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204244.09053: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpce2wlcgb /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py <<< 22736 1727204244.09057: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py" <<< 22736 1727204244.09157: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpce2wlcgb" to remote "/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py" <<< 22736 1727204244.10706: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204244.10826: stderr chunk (state=3): >>><<< 22736 1727204244.10972: stdout chunk (state=3): >>><<< 22736 1727204244.10976: done transferring module to remote 22736 1727204244.10979: _low_level_execute_command(): starting 22736 1727204244.10985: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/ /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py && sleep 0' 22736 1727204244.11524: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204244.11533: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204244.11546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204244.11564: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204244.11578: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204244.11587: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204244.11602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204244.11619: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204244.11711: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204244.11726: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204244.11755: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204244.11834: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204244.13996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204244.14000: stdout chunk (state=3): >>><<< 22736 1727204244.14002: stderr chunk (state=3): >>><<< 22736 1727204244.14005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204244.14008: _low_level_execute_command(): starting 22736 1727204244.14011: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/AnsiballZ_dnf.py && sleep 0' 22736 1727204244.14905: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204244.14909: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204244.14911: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 
1727204244.14914: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204244.14916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204244.14977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204244.14980: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204244.15004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204244.15100: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.65915: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 22736 1727204245.71228: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204245.71312: stderr chunk (state=3): >>><<< 22736 1727204245.71334: stdout chunk (state=3): >>><<< 22736 1727204245.71495: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["iproute"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
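The dnf module result above ("Nothing to do", rc=0, changed=false) indicates that the iproute package was already installed on managed-node2, so the task makes no change. As a rough illustration only (the module drives the dnf Python API directly rather than the command line, so this is an approximation, not what Ansible actually runs), the same check done by hand on the target would look like:

    # Approximate manual equivalent of module_args name=iproute, state=present;
    # dnf exits 0 and reports "Nothing to do." when the package is already installed.
    dnf install -y iproute
    rpm -q iproute   # verify the package is present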
22736 1727204245.71504: done with _execute_module (ansible.legacy.dnf, {'name': 'iproute', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204245.71508: _low_level_execute_command(): starting 22736 1727204245.71510: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204243.7965238-23317-178046191004516/ > /dev/null 2>&1 && sleep 0' 22736 1727204245.72153: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204245.72175: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204245.72311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204245.72336: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204245.72357: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.72445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.74425: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204245.74473: stderr chunk (state=3): >>><<< 22736 1727204245.74480: stdout chunk (state=3): >>><<< 22736 1727204245.74499: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204245.74506: handler run complete 22736 1727204245.74647: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204245.74799: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204245.74834: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204245.74861: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204245.74886: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204245.74951: variable '__install_status' from source: unknown 22736 1727204245.74969: Evaluated conditional (__install_status is success): True 22736 1727204245.74985: attempt loop complete, returning result 22736 1727204245.74988: _execute() done 22736 1727204245.74993: dumping result to json 22736 1727204245.75000: done dumping result, returning 22736 1727204245.75018: done running TaskExecutor() for managed-node2/TASK: Install iproute [12b410aa-8751-4f4a-548a-000000000134] 22736 1727204245.75022: sending task result for task 12b410aa-8751-4f4a-548a-000000000134 22736 1727204245.75116: done sending task result for task 12b410aa-8751-4f4a-548a-000000000134 22736 1727204245.75121: WORKER PROCESS EXITING ok: [managed-node2] => { "attempts": 1, "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 22736 1727204245.75241: no more pending results, returning what we have 22736 1727204245.75245: results queue empty 22736 1727204245.75247: checking for any_errors_fatal 22736 1727204245.75252: done checking for any_errors_fatal 22736 1727204245.75253: checking for max_fail_percentage 22736 1727204245.75255: done checking for max_fail_percentage 22736 1727204245.75256: checking to see if all hosts have failed and the running result is not ok 22736 1727204245.75257: done checking to see if all hosts have failed 22736 1727204245.75258: getting the remaining hosts for this loop 22736 1727204245.75260: done getting the remaining hosts for this loop 22736 1727204245.75264: getting the next task for host managed-node2 22736 1727204245.75271: done getting next task for host managed-node2 22736 1727204245.75274: ^ task is: TASK: Create veth interface {{ interface }} 22736 1727204245.75277: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204245.75281: getting variables 22736 1727204245.75282: in VariableManager get_vars() 22736 1727204245.75322: Calling all_inventory to load vars for managed-node2 22736 1727204245.75325: Calling groups_inventory to load vars for managed-node2 22736 1727204245.75329: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204245.75340: Calling all_plugins_play to load vars for managed-node2 22736 1727204245.75343: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204245.75347: Calling groups_plugins_play to load vars for managed-node2 22736 1727204245.75557: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204245.75717: done with get_vars() 22736 1727204245.75726: done getting variables 22736 1727204245.75775: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204245.75876: variable 'interface' from source: set_fact TASK [Create veth interface lsr27] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Tuesday 24 September 2024 14:57:25 -0400 (0:00:02.056) 0:00:10.543 ***** 22736 1727204245.75908: entering _queue_task() for managed-node2/command 22736 1727204245.76120: worker is 1 (out of 1 available) 22736 1727204245.76135: exiting _queue_task() for managed-node2/command 22736 1727204245.76147: done queuing things up, now waiting for results queue to drain 22736 1727204245.76149: waiting for pending results... 
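The lines that follow repeat, for the first loop item of this task, the same per-task remote execution pattern already seen for the dnf task: resolve the remote home directory, create a private temporary directory, upload the AnsiballZ payload over SFTP, mark it executable, run it with the remote Python interpreter, and remove the temporary directory afterwards. Condensed into the shell commands that appear in the log (directory names shortened to ansible-tmp-XXXX here; the real ones carry timestamped suffixes such as ansible-tmp-1727204245.8071-23364-45502198613491):

    # Per-task remote lifecycle, executed over the multiplexed SSH connection:
    echo ~                                                            # 1. locate the remote home directory
    ( umask 77 && mkdir -p ~/.ansible/tmp && mkdir ~/.ansible/tmp/ansible-tmp-XXXX )   # 2. private workdir
    # 3. sftp put AnsiballZ_command.py into the workdir
    chmod u+x ~/.ansible/tmp/ansible-tmp-XXXX/ ~/.ansible/tmp/ansible-tmp-XXXX/AnsiballZ_command.py
    /usr/bin/python3.12 ~/.ansible/tmp/ansible-tmp-XXXX/AnsiballZ_command.py           # 4. run the module
    rm -f -r ~/.ansible/tmp/ansible-tmp-XXXX/ > /dev/null 2>&1                          # 5. clean up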
22736 1727204245.76313: running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 22736 1727204245.76385: in run() - task 12b410aa-8751-4f4a-548a-000000000135 22736 1727204245.76401: variable 'ansible_search_path' from source: unknown 22736 1727204245.76404: variable 'ansible_search_path' from source: unknown 22736 1727204245.76627: variable 'interface' from source: set_fact 22736 1727204245.76694: variable 'interface' from source: set_fact 22736 1727204245.76763: variable 'interface' from source: set_fact 22736 1727204245.76886: Loaded config def from plugin (lookup/items) 22736 1727204245.76895: Loading LookupModule 'items' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/items.py 22736 1727204245.76916: variable 'omit' from source: magic vars 22736 1727204245.77012: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204245.77024: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204245.77036: variable 'omit' from source: magic vars 22736 1727204245.77234: variable 'ansible_distribution_major_version' from source: facts 22736 1727204245.77241: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204245.77411: variable 'type' from source: set_fact 22736 1727204245.77418: variable 'state' from source: include params 22736 1727204245.77423: variable 'interface' from source: set_fact 22736 1727204245.77429: variable 'current_interfaces' from source: set_fact 22736 1727204245.77436: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22736 1727204245.77443: variable 'omit' from source: magic vars 22736 1727204245.77473: variable 'omit' from source: magic vars 22736 1727204245.77516: variable 'item' from source: unknown 22736 1727204245.77577: variable 'item' from source: unknown 22736 1727204245.77594: variable 'omit' from source: magic vars 22736 1727204245.77622: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204245.77647: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204245.77663: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204245.77681: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204245.77692: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204245.77723: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204245.77727: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204245.77730: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204245.77812: Set connection var ansible_timeout to 10 22736 1727204245.77827: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204245.77836: Set connection var ansible_shell_executable to /bin/sh 22736 1727204245.77839: Set connection var ansible_shell_type to sh 22736 1727204245.77845: Set connection var ansible_pipelining to False 22736 1727204245.77848: Set connection var ansible_connection to ssh 22736 1727204245.77867: variable 'ansible_shell_executable' from source: unknown 22736 1727204245.77870: variable 'ansible_connection' from source: unknown 22736 1727204245.77872: variable 
'ansible_module_compression' from source: unknown 22736 1727204245.77877: variable 'ansible_shell_type' from source: unknown 22736 1727204245.77879: variable 'ansible_shell_executable' from source: unknown 22736 1727204245.77884: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204245.77891: variable 'ansible_pipelining' from source: unknown 22736 1727204245.77894: variable 'ansible_timeout' from source: unknown 22736 1727204245.77901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204245.78018: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204245.78028: variable 'omit' from source: magic vars 22736 1727204245.78040: starting attempt loop 22736 1727204245.78043: running the handler 22736 1727204245.78054: _low_level_execute_command(): starting 22736 1727204245.78062: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204245.78574: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204245.78602: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204245.78607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204245.78624: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.78676: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204245.78683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204245.78685: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.78729: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.80515: stdout chunk (state=3): >>>/root <<< 22736 1727204245.80628: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204245.80684: stderr chunk (state=3): >>><<< 22736 1727204245.80688: stdout chunk (state=3): >>><<< 22736 1727204245.80710: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204245.80723: _low_level_execute_command(): starting 22736 1727204245.80732: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491 `" && echo ansible-tmp-1727204245.8071-23364-45502198613491="` echo /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491 `" ) && sleep 0' 22736 1727204245.81199: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204245.81203: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.81206: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204245.81209: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.81264: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204245.81267: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.81314: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.83373: stdout chunk (state=3): >>>ansible-tmp-1727204245.8071-23364-45502198613491=/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491 <<< 22736 1727204245.83495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204245.83540: stderr chunk (state=3): >>><<< 22736 1727204245.83545: stdout chunk (state=3): >>><<< 22736 1727204245.83559: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204245.8071-23364-45502198613491=/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204245.83593: variable 'ansible_module_compression' from source: unknown 22736 1727204245.83637: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204245.83664: variable 'ansible_facts' from source: unknown 22736 1727204245.83736: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py 22736 1727204245.83846: Sending initial data 22736 1727204245.83850: Sent initial data (152 bytes) 22736 1727204245.84308: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204245.84312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204245.84315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204245.84317: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.84375: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204245.84380: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.84418: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.86068: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22736 1727204245.86078: stderr chunk (state=3): >>>debug2: Unrecognised 
server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204245.86105: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204245.86143: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp3ydfd75y /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py <<< 22736 1727204245.86149: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py" <<< 22736 1727204245.86181: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp3ydfd75y" to remote "/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py" <<< 22736 1727204245.86933: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204245.86998: stderr chunk (state=3): >>><<< 22736 1727204245.87002: stdout chunk (state=3): >>><<< 22736 1727204245.87023: done transferring module to remote 22736 1727204245.87034: _low_level_execute_command(): starting 22736 1727204245.87039: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/ /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py && sleep 0' 22736 1727204245.87495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204245.87504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.87506: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204245.87509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204245.87511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.87560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204245.87564: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.87624: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204245.89495: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204245.89545: stderr chunk (state=3): >>><<< 22736 1727204245.89549: stdout chunk (state=3): >>><<< 22736 1727204245.89571: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204245.89575: _low_level_execute_command(): starting 22736 1727204245.89577: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/AnsiballZ_command.py && sleep 0' 22736 1727204245.90029: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204245.90033: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204245.90036: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204245.90038: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204245.90041: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204245.90095: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204245.90101: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204245.90143: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.08122: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:26.073720", "end": "2024-09-24 14:57:26.078784", "delta": "0:00:00.005064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204246.10874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204246.10878: stdout chunk (state=3): >>><<< 22736 1727204246.10881: stderr chunk (state=3): >>><<< 22736 1727204246.10907: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27"], "start": "2024-09-24 14:57:26.073720", "end": "2024-09-24 14:57:26.078784", "delta": "0:00:00.005064", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link add lsr27 type veth peer name peerlsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
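At this point the first loop item has completed successfully: the command module reports rc=0 for creating the veth pair. The creation command below is taken verbatim from the result above; the follow-up checks are illustrative only and do not appear in the log:

    # Executed by the task (first loop item):
    ip link add lsr27 type veth peer name peerlsr27
    # Illustrative verification (not part of the play): both ends should now exist.
    ip link show lsr27
    ip link show peerlsr27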
22736 1727204246.11075: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link add lsr27 type veth peer name peerlsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204246.11079: _low_level_execute_command(): starting 22736 1727204246.11088: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204245.8071-23364-45502198613491/ > /dev/null 2>&1 && sleep 0' 22736 1727204246.11704: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.11722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.11745: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.11852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.11882: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.11903: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.11928: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.12084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.16419: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.16488: stderr chunk (state=3): >>><<< 22736 1727204246.16696: stdout chunk (state=3): >>><<< 22736 1727204246.16700: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.16703: handler run complete 22736 1727204246.16705: Evaluated conditional (False): False 22736 1727204246.16708: attempt loop complete, returning result 22736 1727204246.16710: variable 'item' from source: unknown 22736 1727204246.16730: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link add lsr27 type veth peer name peerlsr27) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "add", "lsr27", "type", "veth", "peer", "name", "peerlsr27" ], "delta": "0:00:00.005064", "end": "2024-09-24 14:57:26.078784", "item": "ip link add lsr27 type veth peer name peerlsr27", "rc": 0, "start": "2024-09-24 14:57:26.073720" } 22736 1727204246.17100: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.17103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.17106: variable 'omit' from source: magic vars 22736 1727204246.17296: variable 'ansible_distribution_major_version' from source: facts 22736 1727204246.17307: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204246.18022: variable 'type' from source: set_fact 22736 1727204246.18035: variable 'state' from source: include params 22736 1727204246.18045: variable 'interface' from source: set_fact 22736 1727204246.18054: variable 'current_interfaces' from source: set_fact 22736 1727204246.18067: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22736 1727204246.18098: variable 'omit' from source: magic vars 22736 1727204246.18119: variable 'omit' from source: magic vars 22736 1727204246.18193: variable 'item' from source: unknown 22736 1727204246.18274: variable 'item' from source: unknown 22736 1727204246.18314: variable 'omit' from source: magic vars 22736 1727204246.18408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204246.18411: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.18422: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.18425: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204246.18428: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.18430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.18541: Set connection var ansible_timeout to 10 22736 1727204246.18562: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204246.18578: Set connection var ansible_shell_executable to /bin/sh 22736 1727204246.18586: Set connection var ansible_shell_type to sh 22736 1727204246.18601: Set connection var ansible_pipelining to False 22736 1727204246.18610: Set connection var ansible_connection to ssh 22736 1727204246.18651: 
variable 'ansible_shell_executable' from source: unknown 22736 1727204246.18694: variable 'ansible_connection' from source: unknown 22736 1727204246.18697: variable 'ansible_module_compression' from source: unknown 22736 1727204246.18700: variable 'ansible_shell_type' from source: unknown 22736 1727204246.18702: variable 'ansible_shell_executable' from source: unknown 22736 1727204246.18704: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.18707: variable 'ansible_pipelining' from source: unknown 22736 1727204246.18709: variable 'ansible_timeout' from source: unknown 22736 1727204246.18711: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.18891: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204246.18899: variable 'omit' from source: magic vars 22736 1727204246.18902: starting attempt loop 22736 1727204246.18905: running the handler 22736 1727204246.18907: _low_level_execute_command(): starting 22736 1727204246.18909: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204246.19641: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.19656: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.19734: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.19801: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.19842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.19858: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.19944: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.21698: stdout chunk (state=3): >>>/root <<< 22736 1727204246.22024: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.22027: stdout chunk (state=3): >>><<< 22736 1727204246.22030: stderr chunk (state=3): >>><<< 22736 1727204246.22033: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.22035: _low_level_execute_command(): starting 22736 1727204246.22037: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208 `" && echo ansible-tmp-1727204246.2192986-23364-213723355662208="` echo /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208 `" ) && sleep 0' 22736 1727204246.22639: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.22659: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.22679: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.22703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.22753: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.22769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.22860: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.22878: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.22896: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.22920: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.22994: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.25044: stdout chunk (state=3): >>>ansible-tmp-1727204246.2192986-23364-213723355662208=/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208 <<< 22736 1727204246.25247: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.25265: stdout chunk (state=3): >>><<< 22736 1727204246.25278: stderr chunk (state=3): >>><<< 22736 1727204246.25304: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.2192986-23364-213723355662208=/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.25495: variable 'ansible_module_compression' from source: unknown 22736 1727204246.25498: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204246.25501: variable 'ansible_facts' from source: unknown 22736 1727204246.25503: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py 22736 1727204246.25738: Sending initial data 22736 1727204246.25741: Sent initial data (156 bytes) 22736 1727204246.26402: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.26517: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.26545: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.26563: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.26586: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.26675: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.28363: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: 
Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204246.28435: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204246.28483: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuahswu7z /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py <<< 22736 1727204246.28501: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py" <<< 22736 1727204246.28540: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuahswu7z" to remote "/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py" <<< 22736 1727204246.29654: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.29734: stderr chunk (state=3): >>><<< 22736 1727204246.29795: stdout chunk (state=3): >>><<< 22736 1727204246.29799: done transferring module to remote 22736 1727204246.29805: _low_level_execute_command(): starting 22736 1727204246.29819: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/ /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py && sleep 0' 22736 1727204246.30517: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.30565: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.30579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204246.30594: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.30687: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.30748: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.30785: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.32915: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.32919: stdout chunk (state=3): >>><<< 22736 1727204246.32921: stderr chunk (state=3): >>><<< 22736 1727204246.32924: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.32926: _low_level_execute_command(): starting 22736 1727204246.32929: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/AnsiballZ_command.py && sleep 0' 22736 1727204246.33554: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.33571: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.33588: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.33615: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.33635: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204246.33709: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.33753: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.33779: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.33806: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.33879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.51713: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:26.512409", "end": "2024-09-24 14:57:26.516353", "delta": "0:00:00.003944", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": 
null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204246.53433: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204246.53496: stderr chunk (state=3): >>><<< 22736 1727204246.53500: stdout chunk (state=3): >>><<< 22736 1727204246.53519: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "peerlsr27", "up"], "start": "2024-09-24 14:57:26.512409", "end": "2024-09-24 14:57:26.516353", "delta": "0:00:00.003944", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set peerlsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204246.53550: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set peerlsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204246.53557: _low_level_execute_command(): starting 22736 1727204246.53563: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.2192986-23364-213723355662208/ > /dev/null 2>&1 && sleep 0' 22736 1727204246.54066: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.54070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.54072: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.54075: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204246.54077: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.54133: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.54139: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.54141: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.54183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.56147: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.56200: stderr chunk (state=3): >>><<< 22736 1727204246.56205: stdout chunk (state=3): >>><<< 22736 1727204246.56225: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.56231: handler run complete 22736 1727204246.56251: Evaluated conditional (False): False 22736 1727204246.56261: attempt loop complete, returning result 22736 1727204246.56280: variable 'item' from source: unknown 22736 1727204246.56357: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set peerlsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "peerlsr27", "up" ], "delta": "0:00:00.003944", "end": "2024-09-24 14:57:26.516353", "item": "ip link set peerlsr27 up", "rc": 0, "start": "2024-09-24 14:57:26.512409" } 22736 1727204246.56488: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.56493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.56496: variable 'omit' from source: magic vars 22736 1727204246.56638: variable 'ansible_distribution_major_version' from source: facts 22736 1727204246.56644: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204246.56801: variable 'type' from source: set_fact 22736 1727204246.56805: variable 'state' from source: include params 22736 1727204246.56810: variable 'interface' from source: set_fact 22736 1727204246.56817: variable 'current_interfaces' from source: set_fact 22736 1727204246.56826: Evaluated conditional (type == 'veth' and state == 'present' and interface not in current_interfaces): True 22736 1727204246.56829: variable 'omit' from source: magic vars 22736 1727204246.56846: variable 'omit' from source: magic vars 22736 1727204246.56882: variable 'item' from source: unknown 22736 1727204246.56939: variable 'item' from source: unknown 22736 1727204246.56953: variable 'omit' from source: magic vars 22736 1727204246.56973: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204246.56981: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.56988: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.57002: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204246.57005: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.57010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.57075: Set connection var ansible_timeout to 10 22736 1727204246.57085: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204246.57095: Set connection var ansible_shell_executable to /bin/sh 22736 1727204246.57098: Set connection var ansible_shell_type to sh 22736 1727204246.57105: Set connection var ansible_pipelining to False 22736 1727204246.57107: Set connection var ansible_connection to ssh 22736 1727204246.57127: variable 'ansible_shell_executable' 
from source: unknown 22736 1727204246.57130: variable 'ansible_connection' from source: unknown 22736 1727204246.57133: variable 'ansible_module_compression' from source: unknown 22736 1727204246.57137: variable 'ansible_shell_type' from source: unknown 22736 1727204246.57140: variable 'ansible_shell_executable' from source: unknown 22736 1727204246.57146: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.57151: variable 'ansible_pipelining' from source: unknown 22736 1727204246.57154: variable 'ansible_timeout' from source: unknown 22736 1727204246.57165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.57241: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204246.57249: variable 'omit' from source: magic vars 22736 1727204246.57254: starting attempt loop 22736 1727204246.57257: running the handler 22736 1727204246.57264: _low_level_execute_command(): starting 22736 1727204246.57272: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204246.57756: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.57760: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204246.57762: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.57764: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.57831: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.57835: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.57841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.57869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.59552: stdout chunk (state=3): >>>/root <<< 22736 1727204246.59665: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.59719: stderr chunk (state=3): >>><<< 22736 1727204246.59722: stdout chunk (state=3): >>><<< 22736 1727204246.59736: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.59745: _low_level_execute_command(): starting 22736 1727204246.59750: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234 `" && echo ansible-tmp-1727204246.5973608-23364-167032936067234="` echo /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234 `" ) && sleep 0' 22736 1727204246.60239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.60243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.60245: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.60247: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204246.60250: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.60295: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.60298: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.60348: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.62354: stdout chunk (state=3): >>>ansible-tmp-1727204246.5973608-23364-167032936067234=/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234 <<< 22736 1727204246.62467: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.62527: stderr chunk (state=3): >>><<< 22736 1727204246.62530: stdout chunk (state=3): >>><<< 22736 1727204246.62545: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.5973608-23364-167032936067234=/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.62565: variable 'ansible_module_compression' from source: unknown 22736 1727204246.62596: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204246.62619: variable 'ansible_facts' from source: unknown 22736 1727204246.62663: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py 22736 1727204246.62768: Sending initial data 22736 1727204246.62771: Sent initial data (156 bytes) 22736 1727204246.63254: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.63259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.63262: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.63270: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.63272: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.63322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.63326: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.63377: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.65118: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204246.65150: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204246.65184: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp1gkbih5q /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py <<< 22736 1727204246.65193: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py" <<< 22736 1727204246.65224: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp1gkbih5q" to remote "/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py" <<< 22736 1727204246.65229: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py" <<< 22736 1727204246.66225: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.66229: stderr chunk (state=3): >>><<< 22736 1727204246.66396: stdout chunk (state=3): >>><<< 22736 1727204246.66400: done transferring module to remote 22736 1727204246.66403: _low_level_execute_command(): starting 22736 1727204246.66405: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/ /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py && sleep 0' 22736 1727204246.66900: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.66918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.66931: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.66977: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.67000: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.67033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.69198: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.69202: stdout chunk (state=3): >>><<< 22736 1727204246.69204: stderr chunk (state=3): >>><<< 22736 1727204246.69207: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.69209: _low_level_execute_command(): starting 22736 1727204246.69212: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/AnsiballZ_command.py && sleep 0' 22736 1727204246.70237: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204246.70247: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.70264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.70282: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204246.70298: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204246.70454: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204246.70516: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.70520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.88723: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:26.881633", "end": "2024-09-24 14:57:26.885511", "delta": "0:00:00.003878", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204246.90383: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204246.90416: stderr chunk (state=3): >>><<< 22736 1727204246.90420: stdout chunk (state=3): >>><<< 22736 1727204246.90439: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "set", "lsr27", "up"], "start": "2024-09-24 14:57:26.881633", "end": "2024-09-24 14:57:26.885511", "delta": "0:00:00.003878", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link set lsr27 up", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
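Both loop items of TASK: Create veth interface lsr27 have now executed. Stripped of the transport noise, the commands the command module ran on managed-node2 were simply:

    ip link set peerlsr27 up    # first loop item, rc=0, delta 0:00:00.003944
    ip link set lsr27 up        # second loop item, rc=0, delta 0:00:00.003878

Both returned rc=0 with empty stdout and stderr, which is why each item is reported as ok below.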
22736 1727204246.90465: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link set lsr27 up', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204246.90472: _low_level_execute_command(): starting 22736 1727204246.90478: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.5973608-23364-167032936067234/ > /dev/null 2>&1 && sleep 0' 22736 1727204246.90951: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.90954: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.90957: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.90959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204246.90961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.91013: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.91022: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.91061: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.92975: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.93023: stderr chunk (state=3): >>><<< 22736 1727204246.93026: stdout chunk (state=3): >>><<< 22736 1727204246.93040: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204246.93046: handler run complete 22736 1727204246.93069: Evaluated conditional (False): False 22736 1727204246.93078: attempt loop complete, returning result 22736 1727204246.93099: variable 'item' from source: unknown 22736 1727204246.93172: variable 'item' from source: unknown ok: [managed-node2] => (item=ip link set lsr27 up) => { "ansible_loop_var": "item", "changed": false, "cmd": [ "ip", "link", "set", "lsr27", "up" ], "delta": "0:00:00.003878", "end": "2024-09-24 14:57:26.885511", "item": "ip link set lsr27 up", "rc": 0, "start": "2024-09-24 14:57:26.881633" } 22736 1727204246.93307: dumping result to json 22736 1727204246.93310: done dumping result, returning 22736 1727204246.93313: done running TaskExecutor() for managed-node2/TASK: Create veth interface lsr27 [12b410aa-8751-4f4a-548a-000000000135] 22736 1727204246.93315: sending task result for task 12b410aa-8751-4f4a-548a-000000000135 22736 1727204246.93365: done sending task result for task 12b410aa-8751-4f4a-548a-000000000135 22736 1727204246.93368: WORKER PROCESS EXITING 22736 1727204246.93458: no more pending results, returning what we have 22736 1727204246.93462: results queue empty 22736 1727204246.93463: checking for any_errors_fatal 22736 1727204246.93469: done checking for any_errors_fatal 22736 1727204246.93470: checking for max_fail_percentage 22736 1727204246.93471: done checking for max_fail_percentage 22736 1727204246.93472: checking to see if all hosts have failed and the running result is not ok 22736 1727204246.93474: done checking to see if all hosts have failed 22736 1727204246.93474: getting the remaining hosts for this loop 22736 1727204246.93476: done getting the remaining hosts for this loop 22736 1727204246.93482: getting the next task for host managed-node2 22736 1727204246.93490: done getting next task for host managed-node2 22736 1727204246.93493: ^ task is: TASK: Set up veth as managed by NetworkManager 22736 1727204246.93497: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204246.93505: getting variables 22736 1727204246.93506: in VariableManager get_vars() 22736 1727204246.93537: Calling all_inventory to load vars for managed-node2 22736 1727204246.93540: Calling groups_inventory to load vars for managed-node2 22736 1727204246.93545: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204246.93557: Calling all_plugins_play to load vars for managed-node2 22736 1727204246.93560: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204246.93563: Calling groups_plugins_play to load vars for managed-node2 22736 1727204246.93943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204246.94097: done with get_vars() 22736 1727204246.94106: done getting variables 22736 1727204246.94156: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set up veth as managed by NetworkManager] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:35 Tuesday 24 September 2024 14:57:26 -0400 (0:00:01.182) 0:00:11.726 ***** 22736 1727204246.94178: entering _queue_task() for managed-node2/command 22736 1727204246.94386: worker is 1 (out of 1 available) 22736 1727204246.94402: exiting _queue_task() for managed-node2/command 22736 1727204246.94414: done queuing things up, now waiting for results queue to drain 22736 1727204246.94416: waiting for pending results... 
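At this point both ends of the veth pair should exist and be up. As a hedged aside (not something this run executes), one quick way to confirm that by hand on the managed node would be:

    ip -d link show lsr27        # detailed view; an lsr27@peerlsr27 name indicates the peer when both ends share a namespace
    ip -d link show peerlsr27    # same check from the other end of the pair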
22736 1727204246.94583: running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager 22736 1727204246.94666: in run() - task 12b410aa-8751-4f4a-548a-000000000136 22736 1727204246.94679: variable 'ansible_search_path' from source: unknown 22736 1727204246.94683: variable 'ansible_search_path' from source: unknown 22736 1727204246.94716: calling self._execute() 22736 1727204246.94786: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.94794: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.94804: variable 'omit' from source: magic vars 22736 1727204246.95120: variable 'ansible_distribution_major_version' from source: facts 22736 1727204246.95131: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204246.95267: variable 'type' from source: set_fact 22736 1727204246.95271: variable 'state' from source: include params 22736 1727204246.95278: Evaluated conditional (type == 'veth' and state == 'present'): True 22736 1727204246.95284: variable 'omit' from source: magic vars 22736 1727204246.95323: variable 'omit' from source: magic vars 22736 1727204246.95403: variable 'interface' from source: set_fact 22736 1727204246.95423: variable 'omit' from source: magic vars 22736 1727204246.95456: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204246.95487: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204246.95507: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204246.95531: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.95540: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204246.95568: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204246.95571: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.95576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.95663: Set connection var ansible_timeout to 10 22736 1727204246.95673: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204246.95681: Set connection var ansible_shell_executable to /bin/sh 22736 1727204246.95684: Set connection var ansible_shell_type to sh 22736 1727204246.95693: Set connection var ansible_pipelining to False 22736 1727204246.95695: Set connection var ansible_connection to ssh 22736 1727204246.95714: variable 'ansible_shell_executable' from source: unknown 22736 1727204246.95721: variable 'ansible_connection' from source: unknown 22736 1727204246.95723: variable 'ansible_module_compression' from source: unknown 22736 1727204246.95728: variable 'ansible_shell_type' from source: unknown 22736 1727204246.95732: variable 'ansible_shell_executable' from source: unknown 22736 1727204246.95735: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204246.95745: variable 'ansible_pipelining' from source: unknown 22736 1727204246.95748: variable 'ansible_timeout' from source: unknown 22736 1727204246.95750: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204246.95870: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204246.95882: variable 'omit' from source: magic vars 22736 1727204246.95887: starting attempt loop 22736 1727204246.95892: running the handler 22736 1727204246.95908: _low_level_execute_command(): starting 22736 1727204246.95918: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204246.96460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.96464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.96467: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204246.96470: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.96528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.96536: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.96577: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204246.98319: stdout chunk (state=3): >>>/root <<< 22736 1727204246.98428: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204246.98481: stderr chunk (state=3): >>><<< 22736 1727204246.98486: stdout chunk (state=3): >>><<< 22736 1727204246.98515: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 
1727204246.98524: _low_level_execute_command(): starting 22736 1727204246.98530: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495 `" && echo ansible-tmp-1727204246.985104-23413-171028450900495="` echo /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495 `" ) && sleep 0' 22736 1727204246.98993: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204246.98997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204246.99006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204246.99009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204246.99012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204246.99061: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204246.99067: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204246.99107: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.01133: stdout chunk (state=3): >>>ansible-tmp-1727204246.985104-23413-171028450900495=/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495 <<< 22736 1727204247.01250: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.01304: stderr chunk (state=3): >>><<< 22736 1727204247.01308: stdout chunk (state=3): >>><<< 22736 1727204247.01329: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204246.985104-23413-171028450900495=/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.01357: variable 'ansible_module_compression' from source: unknown 22736 1727204247.01406: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204247.01441: variable 'ansible_facts' from source: unknown 22736 1727204247.01533: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py 22736 1727204247.01623: Sending initial data 22736 1727204247.01627: Sent initial data (155 bytes) 22736 1727204247.02094: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.02098: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.02100: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.02102: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.02157: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.02161: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.02206: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.03851: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 22736 1727204247.03854: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204247.03885: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204247.03923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuwja2foz /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py <<< 22736 1727204247.03927: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py" <<< 22736 1727204247.03963: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuwja2foz" to remote "/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py" <<< 22736 1727204247.04738: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.04805: stderr chunk (state=3): >>><<< 22736 1727204247.04809: stdout chunk (state=3): >>><<< 22736 1727204247.04835: done transferring module to remote 22736 1727204247.04846: _low_level_execute_command(): starting 22736 1727204247.04852: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/ /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py && sleep 0' 22736 1727204247.05330: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204247.05334: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204247.05337: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.05339: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.05345: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204247.05347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.05400: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.05404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.05442: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.07331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.07381: stderr chunk (state=3): >>><<< 22736 1727204247.07384: stdout chunk (state=3): >>><<< 22736 1727204247.07400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.07404: _low_level_execute_command(): starting 22736 1727204247.07411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/AnsiballZ_command.py && sleep 0' 22736 1727204247.07865: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.07868: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204247.07871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204247.07875: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.07937: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.07940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.07983: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.27664: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:27.255065", "end": "2024-09-24 14:57:27.275543", "delta": "0:00:00.020478", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204247.29465: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204247.29470: stdout chunk (state=3): >>><<< 22736 1727204247.29472: stderr chunk (state=3): >>><<< 22736 1727204247.29498: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["nmcli", "d", "set", "lsr27", "managed", "true"], "start": "2024-09-24 14:57:27.255065", "end": "2024-09-24 14:57:27.275543", "delta": "0:00:00.020478", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli d set lsr27 managed true", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
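The module invocation captured above (ansible.legacy.command with _raw_params "nmcli d set lsr27 managed true") belongs to the task "Set up veth as managed by NetworkManager" from manage_test_interface.yml. A minimal sketch of what that task likely looks like, reconstructed only from the logged task name and module args; the task file itself is not reproduced in this log, and the {{ interface }} templating is an assumption based on the templated task names elsewhere in the run:

# Hedged reconstruction; only the task name and the resulting nmcli command are
# confirmed by the log. Templating and any changed_when/when settings are assumptions.
- name: Set up veth as managed by NetworkManager
  ansible.builtin.command: nmcli d set {{ interface }} managed true

Note that the callback output a little further down reports "changed": false for this task even though the module result above says "changed": true; that is consistent with a changed_when: false override on the task, although the override itself is not visible in the log.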
22736 1727204247.29649: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli d set lsr27 managed true', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204247.29653: _low_level_execute_command(): starting 22736 1727204247.29656: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204246.985104-23413-171028450900495/ > /dev/null 2>&1 && sleep 0' 22736 1727204247.30311: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.30342: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.30362: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.30385: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.30458: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.32497: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.32514: stdout chunk (state=3): >>><<< 22736 1727204247.32528: stderr chunk (state=3): >>><<< 22736 1727204247.32698: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.32702: handler run complete 22736 1727204247.32705: Evaluated conditional (False): False 22736 1727204247.32707: attempt loop complete, returning result 22736 1727204247.32709: _execute() done 22736 1727204247.32712: dumping result to json 22736 1727204247.32714: done dumping result, returning 22736 1727204247.32716: done running TaskExecutor() for managed-node2/TASK: Set up veth as managed by NetworkManager [12b410aa-8751-4f4a-548a-000000000136] 22736 1727204247.32718: sending task result for task 12b410aa-8751-4f4a-548a-000000000136 22736 1727204247.32797: done sending task result for task 12b410aa-8751-4f4a-548a-000000000136 22736 1727204247.32800: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "nmcli", "d", "set", "lsr27", "managed", "true" ], "delta": "0:00:00.020478", "end": "2024-09-24 14:57:27.275543", "rc": 0, "start": "2024-09-24 14:57:27.255065" } 22736 1727204247.32888: no more pending results, returning what we have 22736 1727204247.32893: results queue empty 22736 1727204247.32895: checking for any_errors_fatal 22736 1727204247.32909: done checking for any_errors_fatal 22736 1727204247.32914: checking for max_fail_percentage 22736 1727204247.32916: done checking for max_fail_percentage 22736 1727204247.32917: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.32919: done checking to see if all hosts have failed 22736 1727204247.32920: getting the remaining hosts for this loop 22736 1727204247.32922: done getting the remaining hosts for this loop 22736 1727204247.32927: getting the next task for host managed-node2 22736 1727204247.32935: done getting next task for host managed-node2 22736 1727204247.32937: ^ task is: TASK: Delete veth interface {{ interface }} 22736 1727204247.32941: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.32946: getting variables 22736 1727204247.32948: in VariableManager get_vars() 22736 1727204247.32981: Calling all_inventory to load vars for managed-node2 22736 1727204247.32984: Calling groups_inventory to load vars for managed-node2 22736 1727204247.32988: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.33210: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.33214: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.33218: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.33550: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.33848: done with get_vars() 22736 1727204247.33861: done getting variables 22736 1727204247.33931: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204247.34080: variable 'interface' from source: set_fact TASK [Delete veth interface lsr27] ********************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:43 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.399) 0:00:12.126 ***** 22736 1727204247.34116: entering _queue_task() for managed-node2/command 22736 1727204247.34499: worker is 1 (out of 1 available) 22736 1727204247.34512: exiting _queue_task() for managed-node2/command 22736 1727204247.34524: done queuing things up, now waiting for results queue to drain 22736 1727204247.34525: waiting for pending results... 
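The task queued here, "Delete veth interface lsr27", and the dummy/tap create/delete tasks that follow it are each guarded by a when expression over type, state and current_interfaces. Every one of them evaluates to False in the lines below, which is consistent with this run using type == 'veth' and state == 'present'. A hedged sketch of the guard pattern, using the false_condition reported for this task; only the name and when expression come from the log, the module and command body are illustrative assumptions:

# Sketch of the conditional guard pattern; the command shown is an assumption,
# the when expression is the false_condition reported in the log below.
- name: Delete veth interface {{ interface }}
  ansible.builtin.command: ip link del {{ interface }}   # assumed body, for illustration only
  when: type == 'veth' and state == 'absent' and interface in current_interfaces

The "Create dummy interface", "Delete dummy interface", "Create tap interface" and "Delete tap interface" tasks that follow use the same pattern with type set to 'dummy' or 'tap' and state to 'present' or 'absent', so each is skipped in turn.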
22736 1727204247.34808: running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 22736 1727204247.34828: in run() - task 12b410aa-8751-4f4a-548a-000000000137 22736 1727204247.34848: variable 'ansible_search_path' from source: unknown 22736 1727204247.34857: variable 'ansible_search_path' from source: unknown 22736 1727204247.34934: calling self._execute() 22736 1727204247.35003: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.35022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.35044: variable 'omit' from source: magic vars 22736 1727204247.35558: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.35562: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.35815: variable 'type' from source: set_fact 22736 1727204247.35827: variable 'state' from source: include params 22736 1727204247.35837: variable 'interface' from source: set_fact 22736 1727204247.35846: variable 'current_interfaces' from source: set_fact 22736 1727204247.35860: Evaluated conditional (type == 'veth' and state == 'absent' and interface in current_interfaces): False 22736 1727204247.35868: when evaluation is False, skipping this task 22736 1727204247.35876: _execute() done 22736 1727204247.35885: dumping result to json 22736 1727204247.35899: done dumping result, returning 22736 1727204247.35917: done running TaskExecutor() for managed-node2/TASK: Delete veth interface lsr27 [12b410aa-8751-4f4a-548a-000000000137] 22736 1727204247.35995: sending task result for task 12b410aa-8751-4f4a-548a-000000000137 22736 1727204247.36068: done sending task result for task 12b410aa-8751-4f4a-548a-000000000137 22736 1727204247.36072: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'veth' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22736 1727204247.36132: no more pending results, returning what we have 22736 1727204247.36137: results queue empty 22736 1727204247.36138: checking for any_errors_fatal 22736 1727204247.36150: done checking for any_errors_fatal 22736 1727204247.36151: checking for max_fail_percentage 22736 1727204247.36153: done checking for max_fail_percentage 22736 1727204247.36154: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.36155: done checking to see if all hosts have failed 22736 1727204247.36156: getting the remaining hosts for this loop 22736 1727204247.36157: done getting the remaining hosts for this loop 22736 1727204247.36162: getting the next task for host managed-node2 22736 1727204247.36170: done getting next task for host managed-node2 22736 1727204247.36174: ^ task is: TASK: Create dummy interface {{ interface }} 22736 1727204247.36178: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.36183: getting variables 22736 1727204247.36185: in VariableManager get_vars() 22736 1727204247.36218: Calling all_inventory to load vars for managed-node2 22736 1727204247.36222: Calling groups_inventory to load vars for managed-node2 22736 1727204247.36226: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.36358: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.36363: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.36368: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.36683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.37030: done with get_vars() 22736 1727204247.37041: done getting variables 22736 1727204247.37114: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204247.37246: variable 'interface' from source: set_fact TASK [Create dummy interface lsr27] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:49 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.031) 0:00:12.157 ***** 22736 1727204247.37279: entering _queue_task() for managed-node2/command 22736 1727204247.37534: worker is 1 (out of 1 available) 22736 1727204247.37661: exiting _queue_task() for managed-node2/command 22736 1727204247.37673: done queuing things up, now waiting for results queue to drain 22736 1727204247.37674: waiting for pending results... 
22736 1727204247.37907: running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 22736 1727204247.37990: in run() - task 12b410aa-8751-4f4a-548a-000000000138 22736 1727204247.38001: variable 'ansible_search_path' from source: unknown 22736 1727204247.38005: variable 'ansible_search_path' from source: unknown 22736 1727204247.38098: calling self._execute() 22736 1727204247.38141: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.38155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.38170: variable 'omit' from source: magic vars 22736 1727204247.38606: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.38624: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.38972: variable 'type' from source: set_fact 22736 1727204247.38975: variable 'state' from source: include params 22736 1727204247.38978: variable 'interface' from source: set_fact 22736 1727204247.38980: variable 'current_interfaces' from source: set_fact 22736 1727204247.38983: Evaluated conditional (type == 'dummy' and state == 'present' and interface not in current_interfaces): False 22736 1727204247.38985: when evaluation is False, skipping this task 22736 1727204247.38987: _execute() done 22736 1727204247.38991: dumping result to json 22736 1727204247.38994: done dumping result, returning 22736 1727204247.38997: done running TaskExecutor() for managed-node2/TASK: Create dummy interface lsr27 [12b410aa-8751-4f4a-548a-000000000138] 22736 1727204247.39008: sending task result for task 12b410aa-8751-4f4a-548a-000000000138 22736 1727204247.39198: done sending task result for task 12b410aa-8751-4f4a-548a-000000000138 22736 1727204247.39202: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22736 1727204247.39255: no more pending results, returning what we have 22736 1727204247.39259: results queue empty 22736 1727204247.39260: checking for any_errors_fatal 22736 1727204247.39265: done checking for any_errors_fatal 22736 1727204247.39266: checking for max_fail_percentage 22736 1727204247.39268: done checking for max_fail_percentage 22736 1727204247.39269: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.39270: done checking to see if all hosts have failed 22736 1727204247.39271: getting the remaining hosts for this loop 22736 1727204247.39273: done getting the remaining hosts for this loop 22736 1727204247.39278: getting the next task for host managed-node2 22736 1727204247.39285: done getting next task for host managed-node2 22736 1727204247.39288: ^ task is: TASK: Delete dummy interface {{ interface }} 22736 1727204247.39294: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.39299: getting variables 22736 1727204247.39301: in VariableManager get_vars() 22736 1727204247.39332: Calling all_inventory to load vars for managed-node2 22736 1727204247.39335: Calling groups_inventory to load vars for managed-node2 22736 1727204247.39340: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.39356: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.39360: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.39364: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.39751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.40049: done with get_vars() 22736 1727204247.40060: done getting variables 22736 1727204247.40126: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204247.40260: variable 'interface' from source: set_fact TASK [Delete dummy interface lsr27] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:54 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.030) 0:00:12.187 ***** 22736 1727204247.40294: entering _queue_task() for managed-node2/command 22736 1727204247.40546: worker is 1 (out of 1 available) 22736 1727204247.40560: exiting _queue_task() for managed-node2/command 22736 1727204247.40573: done queuing things up, now waiting for results queue to drain 22736 1727204247.40575: waiting for pending results... 
22736 1727204247.40919: running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 22736 1727204247.40961: in run() - task 12b410aa-8751-4f4a-548a-000000000139 22736 1727204247.40981: variable 'ansible_search_path' from source: unknown 22736 1727204247.40991: variable 'ansible_search_path' from source: unknown 22736 1727204247.41039: calling self._execute() 22736 1727204247.41133: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.41151: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.41166: variable 'omit' from source: magic vars 22736 1727204247.41669: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.41673: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.41908: variable 'type' from source: set_fact 22736 1727204247.41920: variable 'state' from source: include params 22736 1727204247.41929: variable 'interface' from source: set_fact 22736 1727204247.41938: variable 'current_interfaces' from source: set_fact 22736 1727204247.41950: Evaluated conditional (type == 'dummy' and state == 'absent' and interface in current_interfaces): False 22736 1727204247.41958: when evaluation is False, skipping this task 22736 1727204247.41965: _execute() done 22736 1727204247.41973: dumping result to json 22736 1727204247.41981: done dumping result, returning 22736 1727204247.41995: done running TaskExecutor() for managed-node2/TASK: Delete dummy interface lsr27 [12b410aa-8751-4f4a-548a-000000000139] 22736 1727204247.42010: sending task result for task 12b410aa-8751-4f4a-548a-000000000139 22736 1727204247.42239: done sending task result for task 12b410aa-8751-4f4a-548a-000000000139 22736 1727204247.42242: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'dummy' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22736 1727204247.42295: no more pending results, returning what we have 22736 1727204247.42299: results queue empty 22736 1727204247.42300: checking for any_errors_fatal 22736 1727204247.42306: done checking for any_errors_fatal 22736 1727204247.42307: checking for max_fail_percentage 22736 1727204247.42309: done checking for max_fail_percentage 22736 1727204247.42310: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.42311: done checking to see if all hosts have failed 22736 1727204247.42312: getting the remaining hosts for this loop 22736 1727204247.42314: done getting the remaining hosts for this loop 22736 1727204247.42318: getting the next task for host managed-node2 22736 1727204247.42329: done getting next task for host managed-node2 22736 1727204247.42332: ^ task is: TASK: Create tap interface {{ interface }} 22736 1727204247.42336: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.42341: getting variables 22736 1727204247.42343: in VariableManager get_vars() 22736 1727204247.42374: Calling all_inventory to load vars for managed-node2 22736 1727204247.42378: Calling groups_inventory to load vars for managed-node2 22736 1727204247.42382: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.42397: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.42401: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.42405: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.42788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.43029: done with get_vars() 22736 1727204247.43041: done getting variables 22736 1727204247.43130: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204247.43264: variable 'interface' from source: set_fact TASK [Create tap interface lsr27] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:60 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.030) 0:00:12.217 ***** 22736 1727204247.43304: entering _queue_task() for managed-node2/command 22736 1727204247.43591: worker is 1 (out of 1 available) 22736 1727204247.43607: exiting _queue_task() for managed-node2/command 22736 1727204247.43624: done queuing things up, now waiting for results queue to drain 22736 1727204247.43626: waiting for pending results... 
22736 1727204247.43972: running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 22736 1727204247.43996: in run() - task 12b410aa-8751-4f4a-548a-00000000013a 22736 1727204247.44022: variable 'ansible_search_path' from source: unknown 22736 1727204247.44032: variable 'ansible_search_path' from source: unknown 22736 1727204247.44083: calling self._execute() 22736 1727204247.44187: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.44205: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.44223: variable 'omit' from source: magic vars 22736 1727204247.44669: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.44688: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.44980: variable 'type' from source: set_fact 22736 1727204247.44995: variable 'state' from source: include params 22736 1727204247.45006: variable 'interface' from source: set_fact 22736 1727204247.45016: variable 'current_interfaces' from source: set_fact 22736 1727204247.45054: Evaluated conditional (type == 'tap' and state == 'present' and interface not in current_interfaces): False 22736 1727204247.45057: when evaluation is False, skipping this task 22736 1727204247.45060: _execute() done 22736 1727204247.45062: dumping result to json 22736 1727204247.45065: done dumping result, returning 22736 1727204247.45073: done running TaskExecutor() for managed-node2/TASK: Create tap interface lsr27 [12b410aa-8751-4f4a-548a-00000000013a] 22736 1727204247.45163: sending task result for task 12b410aa-8751-4f4a-548a-00000000013a 22736 1727204247.45272: done sending task result for task 12b410aa-8751-4f4a-548a-00000000013a 22736 1727204247.45276: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'present' and interface not in current_interfaces", "skip_reason": "Conditional result was False" } 22736 1727204247.45334: no more pending results, returning what we have 22736 1727204247.45339: results queue empty 22736 1727204247.45340: checking for any_errors_fatal 22736 1727204247.45351: done checking for any_errors_fatal 22736 1727204247.45352: checking for max_fail_percentage 22736 1727204247.45354: done checking for max_fail_percentage 22736 1727204247.45355: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.45356: done checking to see if all hosts have failed 22736 1727204247.45357: getting the remaining hosts for this loop 22736 1727204247.45359: done getting the remaining hosts for this loop 22736 1727204247.45364: getting the next task for host managed-node2 22736 1727204247.45371: done getting next task for host managed-node2 22736 1727204247.45562: ^ task is: TASK: Delete tap interface {{ interface }} 22736 1727204247.45566: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.45571: getting variables 22736 1727204247.45572: in VariableManager get_vars() 22736 1727204247.45605: Calling all_inventory to load vars for managed-node2 22736 1727204247.45608: Calling groups_inventory to load vars for managed-node2 22736 1727204247.45612: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.45623: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.45627: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.45631: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.45873: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.46167: done with get_vars() 22736 1727204247.46180: done getting variables 22736 1727204247.46256: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204247.46395: variable 'interface' from source: set_fact TASK [Delete tap interface lsr27] ********************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:65 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.031) 0:00:12.249 ***** 22736 1727204247.46433: entering _queue_task() for managed-node2/command 22736 1727204247.46917: worker is 1 (out of 1 available) 22736 1727204247.46930: exiting _queue_task() for managed-node2/command 22736 1727204247.46942: done queuing things up, now waiting for results queue to drain 22736 1727204247.46944: waiting for pending results... 
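Once the last of these skipped branches ("Delete tap interface lsr27", queued above) is resolved, the play moves on to verifying the device: tests_ethernet.yml:30 includes assert_device_present.yml, which in turn includes get_interface_stat.yml, whose first task runs the stat module against the interface. A hedged sketch of that chain, reconstructed from the task names, actions and paths logged below; the relative include paths are abbreviated, and the stat path and register name are assumptions, since the stat module arguments have not yet been logged by the end of this section:

# tests_ethernet.yml:30 (include_tasks action and path taken from the log below)
- name: Include the task 'assert_device_present.yml'
  ansible.builtin.include_tasks: tasks/assert_device_present.yml

# assert_device_present.yml:3
- name: Include the task 'get_interface_stat.yml'
  ansible.builtin.include_tasks: tasks/get_interface_stat.yml

# get_interface_stat.yml:3 -- stat action confirmed by the log; path and register are assumptions
- name: Get stat for interface {{ interface }}
  ansible.builtin.stat:
    path: /sys/class/net/{{ interface }}   # assumed target path
  register: interface_stat                 # hypothetical register name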
22736 1727204247.47075: running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 22736 1727204247.47158: in run() - task 12b410aa-8751-4f4a-548a-00000000013b 22736 1727204247.47193: variable 'ansible_search_path' from source: unknown 22736 1727204247.47282: variable 'ansible_search_path' from source: unknown 22736 1727204247.47287: calling self._execute() 22736 1727204247.47349: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.47364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.47380: variable 'omit' from source: magic vars 22736 1727204247.47850: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.47869: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.48247: variable 'type' from source: set_fact 22736 1727204247.48268: variable 'state' from source: include params 22736 1727204247.48284: variable 'interface' from source: set_fact 22736 1727204247.48298: variable 'current_interfaces' from source: set_fact 22736 1727204247.48312: Evaluated conditional (type == 'tap' and state == 'absent' and interface in current_interfaces): False 22736 1727204247.48321: when evaluation is False, skipping this task 22736 1727204247.48370: _execute() done 22736 1727204247.48374: dumping result to json 22736 1727204247.48376: done dumping result, returning 22736 1727204247.48380: done running TaskExecutor() for managed-node2/TASK: Delete tap interface lsr27 [12b410aa-8751-4f4a-548a-00000000013b] 22736 1727204247.48385: sending task result for task 12b410aa-8751-4f4a-548a-00000000013b skipping: [managed-node2] => { "changed": false, "false_condition": "type == 'tap' and state == 'absent' and interface in current_interfaces", "skip_reason": "Conditional result was False" } 22736 1727204247.48541: no more pending results, returning what we have 22736 1727204247.48546: results queue empty 22736 1727204247.48547: checking for any_errors_fatal 22736 1727204247.48554: done checking for any_errors_fatal 22736 1727204247.48555: checking for max_fail_percentage 22736 1727204247.48556: done checking for max_fail_percentage 22736 1727204247.48557: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.48558: done checking to see if all hosts have failed 22736 1727204247.48559: getting the remaining hosts for this loop 22736 1727204247.48561: done getting the remaining hosts for this loop 22736 1727204247.48566: getting the next task for host managed-node2 22736 1727204247.48575: done getting next task for host managed-node2 22736 1727204247.48579: ^ task is: TASK: Include the task 'assert_device_present.yml' 22736 1727204247.48793: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.48798: getting variables 22736 1727204247.48800: in VariableManager get_vars() 22736 1727204247.48828: Calling all_inventory to load vars for managed-node2 22736 1727204247.48832: Calling groups_inventory to load vars for managed-node2 22736 1727204247.48835: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.48847: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.48850: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.48854: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.49155: done sending task result for task 12b410aa-8751-4f4a-548a-00000000013b 22736 1727204247.49159: WORKER PROCESS EXITING 22736 1727204247.49187: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.49472: done with get_vars() 22736 1727204247.49483: done getting variables TASK [Include the task 'assert_device_present.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:30 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.031) 0:00:12.280 ***** 22736 1727204247.49600: entering _queue_task() for managed-node2/include_tasks 22736 1727204247.50000: worker is 1 (out of 1 available) 22736 1727204247.50014: exiting _queue_task() for managed-node2/include_tasks 22736 1727204247.50027: done queuing things up, now waiting for results queue to drain 22736 1727204247.50029: waiting for pending results... 22736 1727204247.50214: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' 22736 1727204247.50337: in run() - task 12b410aa-8751-4f4a-548a-000000000012 22736 1727204247.50366: variable 'ansible_search_path' from source: unknown 22736 1727204247.50412: calling self._execute() 22736 1727204247.50512: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.50528: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.50549: variable 'omit' from source: magic vars 22736 1727204247.51000: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.51023: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.51034: _execute() done 22736 1727204247.51043: dumping result to json 22736 1727204247.51051: done dumping result, returning 22736 1727204247.51063: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_present.yml' [12b410aa-8751-4f4a-548a-000000000012] 22736 1727204247.51073: sending task result for task 12b410aa-8751-4f4a-548a-000000000012 22736 1727204247.51332: no more pending results, returning what we have 22736 1727204247.51338: in VariableManager get_vars() 22736 1727204247.51375: Calling all_inventory to load vars for managed-node2 22736 1727204247.51378: Calling groups_inventory to load vars for managed-node2 22736 1727204247.51382: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.51401: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.51405: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.51409: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.51762: done sending task result for task 12b410aa-8751-4f4a-548a-000000000012 22736 1727204247.51765: WORKER PROCESS EXITING 22736 1727204247.51795: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.52057: done with get_vars() 22736 1727204247.52065: variable 'ansible_search_path' from source: unknown 22736 1727204247.52082: we have included files to process 22736 1727204247.52083: generating all_blocks data 22736 1727204247.52086: done generating all_blocks data 22736 1727204247.52092: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22736 1727204247.52093: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22736 1727204247.52096: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml 22736 1727204247.52296: in VariableManager get_vars() 22736 1727204247.52314: done with get_vars() 22736 1727204247.52465: done processing included file 22736 1727204247.52467: iterating over new_blocks loaded from include file 22736 1727204247.52469: in VariableManager get_vars() 22736 1727204247.52481: done with get_vars() 22736 1727204247.52482: filtering new block on tags 22736 1727204247.52506: done filtering new block on tags 22736 1727204247.52514: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml for managed-node2 22736 1727204247.52520: extending task lists for all hosts with included blocks 22736 1727204247.53363: done extending task lists 22736 1727204247.53365: done processing included files 22736 1727204247.53366: results queue empty 22736 1727204247.53367: checking for any_errors_fatal 22736 1727204247.53370: done checking for any_errors_fatal 22736 1727204247.53371: checking for max_fail_percentage 22736 1727204247.53372: done checking for max_fail_percentage 22736 1727204247.53373: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.53374: done checking to see if all hosts have failed 22736 1727204247.53375: getting the remaining hosts for this loop 22736 1727204247.53376: done getting the remaining hosts for this loop 22736 1727204247.53384: getting the next task for host managed-node2 22736 1727204247.53388: done getting next task for host managed-node2 22736 1727204247.53392: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22736 1727204247.53395: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.53398: getting variables 22736 1727204247.53399: in VariableManager get_vars() 22736 1727204247.53408: Calling all_inventory to load vars for managed-node2 22736 1727204247.53411: Calling groups_inventory to load vars for managed-node2 22736 1727204247.53414: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.53419: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.53423: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.53426: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.53629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.53915: done with get_vars() 22736 1727204247.53930: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.044) 0:00:12.325 ***** 22736 1727204247.54013: entering _queue_task() for managed-node2/include_tasks 22736 1727204247.54426: worker is 1 (out of 1 available) 22736 1727204247.54436: exiting _queue_task() for managed-node2/include_tasks 22736 1727204247.54446: done queuing things up, now waiting for results queue to drain 22736 1727204247.54448: waiting for pending results... 22736 1727204247.54606: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 22736 1727204247.54732: in run() - task 12b410aa-8751-4f4a-548a-0000000001d3 22736 1727204247.54754: variable 'ansible_search_path' from source: unknown 22736 1727204247.54762: variable 'ansible_search_path' from source: unknown 22736 1727204247.54813: calling self._execute() 22736 1727204247.54908: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.54921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.55009: variable 'omit' from source: magic vars 22736 1727204247.55407: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.55425: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.55442: _execute() done 22736 1727204247.55454: dumping result to json 22736 1727204247.55463: done dumping result, returning 22736 1727204247.55473: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-4f4a-548a-0000000001d3] 22736 1727204247.55482: sending task result for task 12b410aa-8751-4f4a-548a-0000000001d3 22736 1727204247.55688: no more pending results, returning what we have 22736 1727204247.55696: in VariableManager get_vars() 22736 1727204247.55733: Calling all_inventory to load vars for managed-node2 22736 1727204247.55737: Calling groups_inventory to load vars for managed-node2 22736 1727204247.55741: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.55758: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.55762: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.55767: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.56183: done sending task result for task 12b410aa-8751-4f4a-548a-0000000001d3 22736 1727204247.56186: WORKER PROCESS EXITING 22736 1727204247.56220: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22736 1727204247.56503: done with get_vars() 22736 1727204247.56512: variable 'ansible_search_path' from source: unknown 22736 1727204247.56513: variable 'ansible_search_path' from source: unknown 22736 1727204247.56560: we have included files to process 22736 1727204247.56561: generating all_blocks data 22736 1727204247.56563: done generating all_blocks data 22736 1727204247.56565: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204247.56566: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204247.56569: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204247.56845: done processing included file 22736 1727204247.56848: iterating over new_blocks loaded from include file 22736 1727204247.56850: in VariableManager get_vars() 22736 1727204247.56870: done with get_vars() 22736 1727204247.56872: filtering new block on tags 22736 1727204247.56893: done filtering new block on tags 22736 1727204247.56896: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 22736 1727204247.56901: extending task lists for all hosts with included blocks 22736 1727204247.57042: done extending task lists 22736 1727204247.57044: done processing included files 22736 1727204247.57045: results queue empty 22736 1727204247.57045: checking for any_errors_fatal 22736 1727204247.57050: done checking for any_errors_fatal 22736 1727204247.57051: checking for max_fail_percentage 22736 1727204247.57052: done checking for max_fail_percentage 22736 1727204247.57053: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.57054: done checking to see if all hosts have failed 22736 1727204247.57055: getting the remaining hosts for this loop 22736 1727204247.57056: done getting the remaining hosts for this loop 22736 1727204247.57059: getting the next task for host managed-node2 22736 1727204247.57065: done getting next task for host managed-node2 22736 1727204247.57067: ^ task is: TASK: Get stat for interface {{ interface }} 22736 1727204247.57071: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.57073: getting variables 22736 1727204247.57074: in VariableManager get_vars() 22736 1727204247.57088: Calling all_inventory to load vars for managed-node2 22736 1727204247.57092: Calling groups_inventory to load vars for managed-node2 22736 1727204247.57096: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.57101: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.57105: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.57108: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.57343: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.57631: done with get_vars() 22736 1727204247.57643: done getting variables 22736 1727204247.57828: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.038) 0:00:12.363 ***** 22736 1727204247.57864: entering _queue_task() for managed-node2/stat 22736 1727204247.58150: worker is 1 (out of 1 available) 22736 1727204247.58278: exiting _queue_task() for managed-node2/stat 22736 1727204247.58294: done queuing things up, now waiting for results queue to drain 22736 1727204247.58296: waiting for pending results... 22736 1727204247.58498: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 22736 1727204247.58651: in run() - task 12b410aa-8751-4f4a-548a-00000000021e 22736 1727204247.58673: variable 'ansible_search_path' from source: unknown 22736 1727204247.58682: variable 'ansible_search_path' from source: unknown 22736 1727204247.58736: calling self._execute() 22736 1727204247.58838: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.58933: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.58938: variable 'omit' from source: magic vars 22736 1727204247.59335: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.59355: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.59373: variable 'omit' from source: magic vars 22736 1727204247.59444: variable 'omit' from source: magic vars 22736 1727204247.59572: variable 'interface' from source: set_fact 22736 1727204247.59605: variable 'omit' from source: magic vars 22736 1727204247.59698: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204247.59707: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204247.59735: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204247.59764: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204247.59783: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204247.59833: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204247.59843: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.59853: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 22736 1727204247.59993: Set connection var ansible_timeout to 10 22736 1727204247.60016: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204247.60043: Set connection var ansible_shell_executable to /bin/sh 22736 1727204247.60051: Set connection var ansible_shell_type to sh 22736 1727204247.60062: Set connection var ansible_pipelining to False 22736 1727204247.60070: Set connection var ansible_connection to ssh 22736 1727204247.60101: variable 'ansible_shell_executable' from source: unknown 22736 1727204247.60110: variable 'ansible_connection' from source: unknown 22736 1727204247.60135: variable 'ansible_module_compression' from source: unknown 22736 1727204247.60138: variable 'ansible_shell_type' from source: unknown 22736 1727204247.60141: variable 'ansible_shell_executable' from source: unknown 22736 1727204247.60195: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.60199: variable 'ansible_pipelining' from source: unknown 22736 1727204247.60201: variable 'ansible_timeout' from source: unknown 22736 1727204247.60203: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.60430: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204247.60449: variable 'omit' from source: magic vars 22736 1727204247.60468: starting attempt loop 22736 1727204247.60477: running the handler 22736 1727204247.60501: _low_level_execute_command(): starting 22736 1727204247.60515: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204247.61352: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.61427: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.61460: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.61497: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.61552: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.63299: stdout chunk (state=3): >>>/root <<< 22736 1727204247.63415: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.63495: stderr chunk (state=3): >>><<< 22736 1727204247.63499: stdout chunk (state=3): >>><<< 22736 1727204247.63521: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.63632: _low_level_execute_command(): starting 22736 1727204247.63637: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856 `" && echo ansible-tmp-1727204247.6352944-23432-267901383193856="` echo /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856 `" ) && sleep 0' 22736 1727204247.64172: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204247.64186: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204247.64308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.64335: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.64353: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.64373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.64438: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.66464: stdout chunk (state=3): >>>ansible-tmp-1727204247.6352944-23432-267901383193856=/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856 <<< 22736 1727204247.66604: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.66682: stderr chunk (state=3): >>><<< 22736 1727204247.66699: stdout chunk (state=3): >>><<< 22736 1727204247.66729: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1727204247.6352944-23432-267901383193856=/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.66888: variable 'ansible_module_compression' from source: unknown 22736 1727204247.66892: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22736 1727204247.66931: variable 'ansible_facts' from source: unknown 22736 1727204247.67043: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py 22736 1727204247.67216: Sending initial data 22736 1727204247.67244: Sent initial data (153 bytes) 22736 1727204247.67893: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204247.67903: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204247.67921: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.68002: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.68038: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.68050: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.68060: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.68150: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.70090: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension 
"statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204247.70097: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204247.70101: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprm52jhga /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py <<< 22736 1727204247.70103: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py" <<< 22736 1727204247.70106: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprm52jhga" to remote "/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py" <<< 22736 1727204247.70988: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.71106: stderr chunk (state=3): >>><<< 22736 1727204247.71123: stdout chunk (state=3): >>><<< 22736 1727204247.71152: done transferring module to remote 22736 1727204247.71170: _low_level_execute_command(): starting 22736 1727204247.71180: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/ /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py && sleep 0' 22736 1727204247.71877: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204247.71978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.72020: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.72038: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.72059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.72127: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 
1727204247.74118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.74136: stdout chunk (state=3): >>><<< 22736 1727204247.74154: stderr chunk (state=3): >>><<< 22736 1727204247.74196: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.74200: _low_level_execute_command(): starting 22736 1727204247.74203: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/AnsiballZ_stat.py && sleep 0' 22736 1727204247.74904: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204247.74925: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204247.74943: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204247.75072: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204247.75099: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.75192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.92760: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36826, "dev": 23, "nlink": 1, "atime": 1727204246.077491, "mtime": 1727204246.077491, "ctime": 1727204246.077491, "wusr": true, "rusr": true, "xusr": true, 
"wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22736 1727204247.94175: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204247.94242: stderr chunk (state=3): >>><<< 22736 1727204247.94249: stdout chunk (state=3): >>><<< 22736 1727204247.94268: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/lsr27", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 36826, "dev": 23, "nlink": 1, "atime": 1727204246.077491, "mtime": 1727204246.077491, "ctime": 1727204246.077491, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204247.94322: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204247.94333: _low_level_execute_command(): starting 22736 1727204247.94339: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204247.6352944-23432-267901383193856/ > /dev/null 2>&1 && sleep 0' 22736 1727204247.94825: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204247.94829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204247.94832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204247.94840: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204247.94901: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204247.94904: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204247.94940: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204247.96856: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204247.96908: stderr chunk (state=3): >>><<< 22736 1727204247.96912: stdout chunk (state=3): >>><<< 22736 1727204247.96929: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204247.96941: handler run complete 22736 1727204247.96982: attempt loop complete, returning result 22736 1727204247.96985: _execute() done 22736 1727204247.96991: dumping result to json 22736 1727204247.96999: done dumping result, returning 22736 1727204247.97007: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [12b410aa-8751-4f4a-548a-00000000021e] 22736 1727204247.97016: sending task result for task 12b410aa-8751-4f4a-548a-00000000021e 22736 1727204247.97129: done sending task result for task 12b410aa-8751-4f4a-548a-00000000021e 22736 1727204247.97132: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "atime": 1727204246.077491, "block_size": 4096, "blocks": 0, "ctime": 1727204246.077491, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 36826, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/lsr27", "lnk_target": "../../devices/virtual/net/lsr27", "mode": "0777", "mtime": 1727204246.077491, "nlink": 1, "path": "/sys/class/net/lsr27", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 22736 1727204247.97252: no more pending results, returning what we have 22736 1727204247.97256: results queue empty 22736 1727204247.97257: checking for any_errors_fatal 22736 1727204247.97259: done checking for any_errors_fatal 22736 1727204247.97260: checking for max_fail_percentage 22736 1727204247.97261: done checking for max_fail_percentage 22736 1727204247.97262: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.97263: done checking to see if all hosts have failed 22736 1727204247.97264: getting the remaining hosts for this loop 22736 1727204247.97265: done getting the remaining hosts for this loop 22736 1727204247.97269: getting the next task for host managed-node2 22736 1727204247.97277: done getting next task for host managed-node2 22736 1727204247.97280: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 22736 1727204247.97283: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.97287: getting variables 22736 1727204247.97290: in VariableManager get_vars() 22736 1727204247.97327: Calling all_inventory to load vars for managed-node2 22736 1727204247.97330: Calling groups_inventory to load vars for managed-node2 22736 1727204247.97334: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.97345: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.97348: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.97352: Calling groups_plugins_play to load vars for managed-node2 22736 1727204247.97503: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204247.97664: done with get_vars() 22736 1727204247.97673: done getting variables 22736 1727204247.97759: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) 22736 1727204247.97863: variable 'interface' from source: set_fact TASK [Assert that the interface is present - 'lsr27'] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Tuesday 24 September 2024 14:57:27 -0400 (0:00:00.400) 0:00:12.763 ***** 22736 1727204247.97890: entering _queue_task() for managed-node2/assert 22736 1727204247.97892: Creating lock for assert 22736 1727204247.98120: worker is 1 (out of 1 available) 22736 1727204247.98134: exiting _queue_task() for managed-node2/assert 22736 1727204247.98147: done queuing things up, now waiting for results queue to drain 22736 1727204247.98148: waiting for pending results... 
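The assert task queued here (task path tasks/assert_device_present.yml:5) only has to check the stat result registered by the previous step; the conditional evaluated a few entries below is interface_stat.stat.exists. A minimal sketch of such a task, assuming the registered variable name from the earlier sketch, could look like:

    - name: Assert that the interface is present - '{{ interface }}'
      ansible.builtin.assert:
        that:
          - interface_stat.stat.exists            # set by the stat task for /sys/class/net/{{ interface }}

With no message supplied, a passing check reports the default "All assertions passed", which matches the task result that follows.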
22736 1727204247.98316: running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' 22736 1727204247.98380: in run() - task 12b410aa-8751-4f4a-548a-0000000001d4 22736 1727204247.98397: variable 'ansible_search_path' from source: unknown 22736 1727204247.98401: variable 'ansible_search_path' from source: unknown 22736 1727204247.98433: calling self._execute() 22736 1727204247.98505: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.98515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.98524: variable 'omit' from source: magic vars 22736 1727204247.98883: variable 'ansible_distribution_major_version' from source: facts 22736 1727204247.98896: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204247.98904: variable 'omit' from source: magic vars 22736 1727204247.98941: variable 'omit' from source: magic vars 22736 1727204247.99024: variable 'interface' from source: set_fact 22736 1727204247.99043: variable 'omit' from source: magic vars 22736 1727204247.99074: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204247.99105: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204247.99123: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204247.99141: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204247.99156: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204247.99182: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204247.99185: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.99191: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.99277: Set connection var ansible_timeout to 10 22736 1727204247.99287: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204247.99299: Set connection var ansible_shell_executable to /bin/sh 22736 1727204247.99302: Set connection var ansible_shell_type to sh 22736 1727204247.99309: Set connection var ansible_pipelining to False 22736 1727204247.99311: Set connection var ansible_connection to ssh 22736 1727204247.99332: variable 'ansible_shell_executable' from source: unknown 22736 1727204247.99336: variable 'ansible_connection' from source: unknown 22736 1727204247.99338: variable 'ansible_module_compression' from source: unknown 22736 1727204247.99341: variable 'ansible_shell_type' from source: unknown 22736 1727204247.99345: variable 'ansible_shell_executable' from source: unknown 22736 1727204247.99351: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204247.99354: variable 'ansible_pipelining' from source: unknown 22736 1727204247.99359: variable 'ansible_timeout' from source: unknown 22736 1727204247.99370: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204247.99484: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22736 1727204247.99496: variable 'omit' from source: magic vars 22736 1727204247.99502: starting attempt loop 22736 1727204247.99505: running the handler 22736 1727204247.99618: variable 'interface_stat' from source: set_fact 22736 1727204247.99633: Evaluated conditional (interface_stat.stat.exists): True 22736 1727204247.99641: handler run complete 22736 1727204247.99654: attempt loop complete, returning result 22736 1727204247.99657: _execute() done 22736 1727204247.99661: dumping result to json 22736 1727204247.99666: done dumping result, returning 22736 1727204247.99674: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is present - 'lsr27' [12b410aa-8751-4f4a-548a-0000000001d4] 22736 1727204247.99678: sending task result for task 12b410aa-8751-4f4a-548a-0000000001d4 22736 1727204247.99770: done sending task result for task 12b410aa-8751-4f4a-548a-0000000001d4 22736 1727204247.99774: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22736 1727204247.99842: no more pending results, returning what we have 22736 1727204247.99845: results queue empty 22736 1727204247.99846: checking for any_errors_fatal 22736 1727204247.99855: done checking for any_errors_fatal 22736 1727204247.99856: checking for max_fail_percentage 22736 1727204247.99857: done checking for max_fail_percentage 22736 1727204247.99858: checking to see if all hosts have failed and the running result is not ok 22736 1727204247.99859: done checking to see if all hosts have failed 22736 1727204247.99861: getting the remaining hosts for this loop 22736 1727204247.99862: done getting the remaining hosts for this loop 22736 1727204247.99865: getting the next task for host managed-node2 22736 1727204247.99873: done getting next task for host managed-node2 22736 1727204247.99875: ^ task is: TASK: meta (flush_handlers) 22736 1727204247.99877: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204247.99881: getting variables 22736 1727204247.99883: in VariableManager get_vars() 22736 1727204247.99911: Calling all_inventory to load vars for managed-node2 22736 1727204247.99916: Calling groups_inventory to load vars for managed-node2 22736 1727204247.99919: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204247.99930: Calling all_plugins_play to load vars for managed-node2 22736 1727204247.99933: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204247.99937: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.00118: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.00272: done with get_vars() 22736 1727204248.00280: done getting variables 22736 1727204248.00335: in VariableManager get_vars() 22736 1727204248.00342: Calling all_inventory to load vars for managed-node2 22736 1727204248.00345: Calling groups_inventory to load vars for managed-node2 22736 1727204248.00347: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.00350: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.00352: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.00354: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.00466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.00617: done with get_vars() 22736 1727204248.00627: done queuing things up, now waiting for results queue to drain 22736 1727204248.00629: results queue empty 22736 1727204248.00629: checking for any_errors_fatal 22736 1727204248.00631: done checking for any_errors_fatal 22736 1727204248.00632: checking for max_fail_percentage 22736 1727204248.00632: done checking for max_fail_percentage 22736 1727204248.00633: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.00634: done checking to see if all hosts have failed 22736 1727204248.00639: getting the remaining hosts for this loop 22736 1727204248.00639: done getting the remaining hosts for this loop 22736 1727204248.00641: getting the next task for host managed-node2 22736 1727204248.00644: done getting next task for host managed-node2 22736 1727204248.00645: ^ task is: TASK: meta (flush_handlers) 22736 1727204248.00646: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204248.00648: getting variables 22736 1727204248.00649: in VariableManager get_vars() 22736 1727204248.00655: Calling all_inventory to load vars for managed-node2 22736 1727204248.00656: Calling groups_inventory to load vars for managed-node2 22736 1727204248.00658: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.00661: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.00663: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.00665: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.00792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.00948: done with get_vars() 22736 1727204248.00955: done getting variables 22736 1727204248.00991: in VariableManager get_vars() 22736 1727204248.00997: Calling all_inventory to load vars for managed-node2 22736 1727204248.00999: Calling groups_inventory to load vars for managed-node2 22736 1727204248.01001: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.01004: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.01006: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.01008: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.01119: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.01268: done with get_vars() 22736 1727204248.01277: done queuing things up, now waiting for results queue to drain 22736 1727204248.01278: results queue empty 22736 1727204248.01279: checking for any_errors_fatal 22736 1727204248.01280: done checking for any_errors_fatal 22736 1727204248.01280: checking for max_fail_percentage 22736 1727204248.01281: done checking for max_fail_percentage 22736 1727204248.01282: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.01282: done checking to see if all hosts have failed 22736 1727204248.01283: getting the remaining hosts for this loop 22736 1727204248.01284: done getting the remaining hosts for this loop 22736 1727204248.01285: getting the next task for host managed-node2 22736 1727204248.01287: done getting next task for host managed-node2 22736 1727204248.01288: ^ task is: None 22736 1727204248.01291: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204248.01292: done queuing things up, now waiting for results queue to drain 22736 1727204248.01293: results queue empty 22736 1727204248.01293: checking for any_errors_fatal 22736 1727204248.01294: done checking for any_errors_fatal 22736 1727204248.01294: checking for max_fail_percentage 22736 1727204248.01295: done checking for max_fail_percentage 22736 1727204248.01295: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.01296: done checking to see if all hosts have failed 22736 1727204248.01297: getting the next task for host managed-node2 22736 1727204248.01299: done getting next task for host managed-node2 22736 1727204248.01299: ^ task is: None 22736 1727204248.01300: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204248.01337: in VariableManager get_vars() 22736 1727204248.01356: done with get_vars() 22736 1727204248.01361: in VariableManager get_vars() 22736 1727204248.01372: done with get_vars() 22736 1727204248.01375: variable 'omit' from source: magic vars 22736 1727204248.01400: in VariableManager get_vars() 22736 1727204248.01410: done with get_vars() 22736 1727204248.01428: variable 'omit' from source: magic vars PLAY [Test static interface up] ************************************************ 22736 1727204248.01960: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204248.01982: getting the remaining hosts for this loop 22736 1727204248.01983: done getting the remaining hosts for this loop 22736 1727204248.01986: getting the next task for host managed-node2 22736 1727204248.01988: done getting next task for host managed-node2 22736 1727204248.01991: ^ task is: TASK: Gathering Facts 22736 1727204248.01993: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204248.01994: getting variables 22736 1727204248.01995: in VariableManager get_vars() 22736 1727204248.02003: Calling all_inventory to load vars for managed-node2 22736 1727204248.02005: Calling groups_inventory to load vars for managed-node2 22736 1727204248.02006: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.02011: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.02015: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.02018: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.02127: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.02275: done with get_vars() 22736 1727204248.02281: done getting variables 22736 1727204248.02315: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:33 Tuesday 24 September 2024 14:57:28 -0400 (0:00:00.044) 0:00:12.808 ***** 22736 1727204248.02334: entering _queue_task() for managed-node2/gather_facts 22736 1727204248.02539: worker is 1 (out of 1 available) 22736 1727204248.02553: exiting _queue_task() for managed-node2/gather_facts 22736 1727204248.02565: done queuing things up, now waiting for results queue to drain 22736 1727204248.02567: waiting for pending results... 
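The PLAY [Test static interface up] banner above marks the start of a new play in tests_ethernet.yml, and the Gathering Facts task queued here (task path tests_ethernet.yml:33) is that play's fact-gathering step. A hedged sketch of how such a play might open, with the hosts pattern and the explicit gather_facts flag as assumptions not taken from the log:

    - name: Test static interface up
      hosts: all            # assumed pattern; this run targets managed-node2
      gather_facts: true    # explicit for clarity; gathering is also the default,
                            # and it produces the TASK [Gathering Facts] run below

The rest of the play (the static-interface test tasks themselves) is not visible in this part of the log.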
22736 1727204248.02728: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204248.02793: in run() - task 12b410aa-8751-4f4a-548a-000000000237 22736 1727204248.02809: variable 'ansible_search_path' from source: unknown 22736 1727204248.02842: calling self._execute() 22736 1727204248.02917: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204248.02921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204248.02926: variable 'omit' from source: magic vars 22736 1727204248.03220: variable 'ansible_distribution_major_version' from source: facts 22736 1727204248.03231: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204248.03237: variable 'omit' from source: magic vars 22736 1727204248.03262: variable 'omit' from source: magic vars 22736 1727204248.03293: variable 'omit' from source: magic vars 22736 1727204248.03325: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204248.03356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204248.03375: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204248.03423: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204248.03434: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204248.03461: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204248.03471: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204248.03478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204248.03562: Set connection var ansible_timeout to 10 22736 1727204248.03576: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204248.03587: Set connection var ansible_shell_executable to /bin/sh 22736 1727204248.03590: Set connection var ansible_shell_type to sh 22736 1727204248.03598: Set connection var ansible_pipelining to False 22736 1727204248.03601: Set connection var ansible_connection to ssh 22736 1727204248.03624: variable 'ansible_shell_executable' from source: unknown 22736 1727204248.03627: variable 'ansible_connection' from source: unknown 22736 1727204248.03630: variable 'ansible_module_compression' from source: unknown 22736 1727204248.03633: variable 'ansible_shell_type' from source: unknown 22736 1727204248.03638: variable 'ansible_shell_executable' from source: unknown 22736 1727204248.03641: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204248.03646: variable 'ansible_pipelining' from source: unknown 22736 1727204248.03649: variable 'ansible_timeout' from source: unknown 22736 1727204248.03654: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204248.03808: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204248.03821: variable 'omit' from source: magic vars 22736 1727204248.03825: starting attempt loop 22736 1727204248.03829: running the 
handler 22736 1727204248.03843: variable 'ansible_facts' from source: unknown 22736 1727204248.03859: _low_level_execute_command(): starting 22736 1727204248.03867: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204248.04420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.04424: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.04427: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.04429: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.04485: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204248.04499: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204248.04503: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.04538: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.06321: stdout chunk (state=3): >>>/root <<< 22736 1727204248.06426: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204248.06493: stderr chunk (state=3): >>><<< 22736 1727204248.06497: stdout chunk (state=3): >>><<< 22736 1727204248.06526: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204248.06535: _low_level_execute_command(): starting 22736 1727204248.06542: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672 `" && echo 
ansible-tmp-1727204248.0652215-23458-43726239331672="` echo /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672 `" ) && sleep 0' 22736 1727204248.07044: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204248.07048: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204248.07051: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.07061: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.07063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.07117: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204248.07120: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.07164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.09200: stdout chunk (state=3): >>>ansible-tmp-1727204248.0652215-23458-43726239331672=/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672 <<< 22736 1727204248.09310: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204248.09370: stderr chunk (state=3): >>><<< 22736 1727204248.09375: stdout chunk (state=3): >>><<< 22736 1727204248.09398: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204248.0652215-23458-43726239331672=/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204248.09432: variable 'ansible_module_compression' from source: unknown 22736 1727204248.09476: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204248.09533: variable 'ansible_facts' from source: unknown 22736 1727204248.09657: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py 22736 1727204248.09786: Sending initial data 22736 1727204248.09793: Sent initial data (153 bytes) 22736 1727204248.10285: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.10288: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.10294: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.10297: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.10354: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204248.10358: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.10406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.12078: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204248.12115: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204248.12148: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpmxjozsfi /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py <<< 22736 1727204248.12156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py" <<< 22736 1727204248.12185: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpmxjozsfi" to remote "/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py" <<< 22736 1727204248.13862: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204248.13935: stderr chunk (state=3): >>><<< 22736 1727204248.13939: stdout chunk (state=3): >>><<< 22736 1727204248.13966: done transferring module to remote 22736 1727204248.13995: _low_level_execute_command(): starting 22736 1727204248.13999: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/ /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py && sleep 0' 22736 1727204248.14473: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.14477: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.14479: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204248.14481: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204248.14484: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.14533: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204248.14551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.14586: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.16552: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204248.16610: stderr chunk (state=3): >>><<< 22736 1727204248.16616: stdout chunk (state=3): >>><<< 22736 1727204248.16629: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204248.16633: _low_level_execute_command(): starting 22736 1727204248.16639: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/AnsiballZ_setup.py && sleep 0' 22736 1727204248.17123: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204248.17127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.17139: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.17196: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204248.17202: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.17259: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.88188: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", 
"ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_loadavg": {"1m": 0.943359375, "5m": 0.67041015625, "15m": 0.4072265625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_te<<< 22736 1727204248.88196: stdout chunk (state=3): >>>ch_host": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "28", "epoch": "1727204248", "epoch_int": "1727204248", "date": "2024-09-24", "time": "14:57:28", "iso8601_micro": "2024-09-24T18:57:28.487919Z", "iso8601": "2024-09-24T18:57:28Z", 
"iso8601_basic": "20240924T145728487919", "iso8601_basic_short": "20240924T145728", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2749, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 968, "free": 2749}, "nocache": {"free": 3379, "used": 338}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": 
["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 752, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146989568, "block_size": 4096, "block_total": 64479564, "block_available": 61315183, "block_used": 3164381, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["peerlsr27", "lsr27", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7807:358f:2c9b:b2cc", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204248.90333: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204248.90399: stderr chunk (state=3): >>><<< 22736 1727204248.90404: stdout chunk (state=3): >>><<< 22736 1727204248.90433: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", 
"ansible_loadavg": {"1m": 0.943359375, "5m": 0.67041015625, "15m": 0.4072265625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "28", "epoch": "1727204248", "epoch_int": "1727204248", "date": "2024-09-24", "time": "14:57:28", "iso8601_micro": "2024-09-24T18:57:28.487919Z", "iso8601": "2024-09-24T18:57:28Z", "iso8601_basic": "20240924T145728487919", "iso8601_basic_short": "20240924T145728", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_hostnqn": "", "ansible_fips": false, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_iscsi_iqn": "", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2749, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 968, "free": 2749}, "nocache": {"free": 3379, "used": 338}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, 
"model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 752, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146989568, "block_size": 4096, "block_total": 64479564, "block_available": 61315183, "block_used": 3164381, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["peerlsr27", "lsr27", "eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7807:358f:2c9b:b2cc", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", 
"fe80::7807:358f:2c9b:b2cc", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc"]}, "ansible_pkg_mgr": "dnf", "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204248.91862: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204248.91865: _low_level_execute_command(): starting 22736 1727204248.91866: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204248.0652215-23458-43726239331672/ > /dev/null 2>&1 && sleep 0' 22736 1727204248.91868: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204248.91869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204248.91871: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.91872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
22736 1727204248.91874: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204248.91875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204248.91876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204248.91877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204248.91879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204248.93296: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204248.93355: stderr chunk (state=3): >>><<< 22736 1727204248.93359: stdout chunk (state=3): >>><<< 22736 1727204248.93374: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204248.93382: handler run complete 22736 1727204248.93504: variable 'ansible_facts' from source: unknown 22736 1727204248.93595: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.93865: variable 'ansible_facts' from source: unknown 22736 1727204248.93943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.94059: attempt loop complete, returning result 22736 1727204248.94062: _execute() done 22736 1727204248.94068: dumping result to json 22736 1727204248.94093: done dumping result, returning 22736 1727204248.94194: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-000000000237] 22736 1727204248.94198: sending task result for task 12b410aa-8751-4f4a-548a-000000000237 ok: [managed-node2] 22736 1727204248.94675: no more pending results, returning what we have 22736 1727204248.94678: results queue empty 22736 1727204248.94679: checking for any_errors_fatal 22736 1727204248.94680: done checking for any_errors_fatal 22736 1727204248.94680: checking for max_fail_percentage 22736 1727204248.94682: done checking for max_fail_percentage 22736 1727204248.94682: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.94683: done checking to see if all hosts have failed 22736 
1727204248.94684: getting the remaining hosts for this loop 22736 1727204248.94685: done getting the remaining hosts for this loop 22736 1727204248.94687: getting the next task for host managed-node2 22736 1727204248.94693: done getting next task for host managed-node2 22736 1727204248.94695: ^ task is: TASK: meta (flush_handlers) 22736 1727204248.94696: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204248.94699: getting variables 22736 1727204248.94700: in VariableManager get_vars() 22736 1727204248.94727: Calling all_inventory to load vars for managed-node2 22736 1727204248.94730: Calling groups_inventory to load vars for managed-node2 22736 1727204248.94731: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.94741: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.94744: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.94748: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.94890: done sending task result for task 12b410aa-8751-4f4a-548a-000000000237 22736 1727204248.94894: WORKER PROCESS EXITING 22736 1727204248.94907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.95073: done with get_vars() 22736 1727204248.95082: done getting variables 22736 1727204248.95141: in VariableManager get_vars() 22736 1727204248.95151: Calling all_inventory to load vars for managed-node2 22736 1727204248.95153: Calling groups_inventory to load vars for managed-node2 22736 1727204248.95155: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.95160: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.95162: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.95164: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.95282: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.95440: done with get_vars() 22736 1727204248.95451: done queuing things up, now waiting for results queue to drain 22736 1727204248.95452: results queue empty 22736 1727204248.95453: checking for any_errors_fatal 22736 1727204248.95456: done checking for any_errors_fatal 22736 1727204248.95456: checking for max_fail_percentage 22736 1727204248.95457: done checking for max_fail_percentage 22736 1727204248.95461: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.95462: done checking to see if all hosts have failed 22736 1727204248.95463: getting the remaining hosts for this loop 22736 1727204248.95463: done getting the remaining hosts for this loop 22736 1727204248.95465: getting the next task for host managed-node2 22736 1727204248.95468: done getting next task for host managed-node2 22736 1727204248.95470: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204248.95471: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 22736 1727204248.95479: getting variables 22736 1727204248.95480: in VariableManager get_vars() 22736 1727204248.95495: Calling all_inventory to load vars for managed-node2 22736 1727204248.95496: Calling groups_inventory to load vars for managed-node2 22736 1727204248.95498: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.95501: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.95503: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.95505: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.95623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.95968: done with get_vars() 22736 1727204248.95975: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:28 -0400 (0:00:00.936) 0:00:13.745 ***** 22736 1727204248.96037: entering _queue_task() for managed-node2/include_tasks 22736 1727204248.96269: worker is 1 (out of 1 available) 22736 1727204248.96284: exiting _queue_task() for managed-node2/include_tasks 22736 1727204248.96299: done queuing things up, now waiting for results queue to drain 22736 1727204248.96300: waiting for pending results... 22736 1727204248.96472: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204248.96552: in run() - task 12b410aa-8751-4f4a-548a-000000000019 22736 1727204248.96566: variable 'ansible_search_path' from source: unknown 22736 1727204248.96569: variable 'ansible_search_path' from source: unknown 22736 1727204248.96606: calling self._execute() 22736 1727204248.96678: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204248.96685: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204248.96696: variable 'omit' from source: magic vars 22736 1727204248.97021: variable 'ansible_distribution_major_version' from source: facts 22736 1727204248.97032: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204248.97038: _execute() done 22736 1727204248.97042: dumping result to json 22736 1727204248.97047: done dumping result, returning 22736 1727204248.97055: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-4f4a-548a-000000000019] 22736 1727204248.97060: sending task result for task 12b410aa-8751-4f4a-548a-000000000019 22736 1727204248.97161: done sending task result for task 12b410aa-8751-4f4a-548a-000000000019 22736 1727204248.97164: WORKER PROCESS EXITING 22736 1727204248.97224: no more pending results, returning what we have 22736 1727204248.97229: in VariableManager get_vars() 22736 1727204248.97268: Calling all_inventory to load vars for managed-node2 22736 1727204248.97270: Calling groups_inventory to load vars for managed-node2 22736 1727204248.97273: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.97282: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.97285: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.97288: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.97448: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.97619: done with get_vars() 22736 1727204248.97626: variable 'ansible_search_path' from source: unknown 22736 1727204248.97627: variable 'ansible_search_path' from source: unknown 22736 1727204248.97651: we have included files to process 22736 1727204248.97652: generating all_blocks data 22736 1727204248.97653: done generating all_blocks data 22736 1727204248.97654: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204248.97654: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204248.97656: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204248.98248: done processing included file 22736 1727204248.98250: iterating over new_blocks loaded from include file 22736 1727204248.98251: in VariableManager get_vars() 22736 1727204248.98266: done with get_vars() 22736 1727204248.98268: filtering new block on tags 22736 1727204248.98280: done filtering new block on tags 22736 1727204248.98282: in VariableManager get_vars() 22736 1727204248.98298: done with get_vars() 22736 1727204248.98299: filtering new block on tags 22736 1727204248.98317: done filtering new block on tags 22736 1727204248.98320: in VariableManager get_vars() 22736 1727204248.98334: done with get_vars() 22736 1727204248.98336: filtering new block on tags 22736 1727204248.98348: done filtering new block on tags 22736 1727204248.98350: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22736 1727204248.98353: extending task lists for all hosts with included blocks 22736 1727204248.98656: done extending task lists 22736 1727204248.98657: done processing included files 22736 1727204248.98658: results queue empty 22736 1727204248.98658: checking for any_errors_fatal 22736 1727204248.98660: done checking for any_errors_fatal 22736 1727204248.98660: checking for max_fail_percentage 22736 1727204248.98661: done checking for max_fail_percentage 22736 1727204248.98662: checking to see if all hosts have failed and the running result is not ok 22736 1727204248.98662: done checking to see if all hosts have failed 22736 1727204248.98663: getting the remaining hosts for this loop 22736 1727204248.98664: done getting the remaining hosts for this loop 22736 1727204248.98666: getting the next task for host managed-node2 22736 1727204248.98669: done getting next task for host managed-node2 22736 1727204248.98670: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204248.98672: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204248.98679: getting variables 22736 1727204248.98680: in VariableManager get_vars() 22736 1727204248.98692: Calling all_inventory to load vars for managed-node2 22736 1727204248.98693: Calling groups_inventory to load vars for managed-node2 22736 1727204248.98695: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204248.98698: Calling all_plugins_play to load vars for managed-node2 22736 1727204248.98700: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204248.98702: Calling groups_plugins_play to load vars for managed-node2 22736 1727204248.98834: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204248.99000: done with get_vars() 22736 1727204248.99007: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:28 -0400 (0:00:00.030) 0:00:13.775 ***** 22736 1727204248.99061: entering _queue_task() for managed-node2/setup 22736 1727204248.99276: worker is 1 (out of 1 available) 22736 1727204248.99292: exiting _queue_task() for managed-node2/setup 22736 1727204248.99306: done queuing things up, now waiting for results queue to drain 22736 1727204248.99307: waiting for pending results... 22736 1727204248.99471: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204248.99562: in run() - task 12b410aa-8751-4f4a-548a-000000000279 22736 1727204248.99574: variable 'ansible_search_path' from source: unknown 22736 1727204248.99577: variable 'ansible_search_path' from source: unknown 22736 1727204248.99615: calling self._execute() 22736 1727204248.99682: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204248.99688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204248.99701: variable 'omit' from source: magic vars 22736 1727204249.00003: variable 'ansible_distribution_major_version' from source: facts 22736 1727204249.00017: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204249.00192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204249.01906: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204249.01973: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204249.02006: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204249.02042: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204249.02065: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204249.02137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204249.02165: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22736 1727204249.02187: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204249.02224: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204249.02237: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204249.02287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204249.02308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204249.02331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204249.02365: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204249.02381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204249.02513: variable '__network_required_facts' from source: role '' defaults 22736 1727204249.02523: variable 'ansible_facts' from source: unknown 22736 1727204249.02605: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22736 1727204249.02609: when evaluation is False, skipping this task 22736 1727204249.02612: _execute() done 22736 1727204249.02615: dumping result to json 22736 1727204249.02619: done dumping result, returning 22736 1727204249.02627: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-4f4a-548a-000000000279] 22736 1727204249.02633: sending task result for task 12b410aa-8751-4f4a-548a-000000000279 22736 1727204249.02722: done sending task result for task 12b410aa-8751-4f4a-548a-000000000279 22736 1727204249.02725: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204249.02778: no more pending results, returning what we have 22736 1727204249.02781: results queue empty 22736 1727204249.02783: checking for any_errors_fatal 22736 1727204249.02785: done checking for any_errors_fatal 22736 1727204249.02786: checking for max_fail_percentage 22736 1727204249.02787: done checking for max_fail_percentage 22736 1727204249.02788: checking to see if all hosts have failed and the running result is not ok 22736 1727204249.02791: done checking to see if all hosts have failed 22736 1727204249.02792: getting the remaining hosts for this loop 22736 1727204249.02793: done getting the remaining hosts for 
this loop 22736 1727204249.02798: getting the next task for host managed-node2 22736 1727204249.02808: done getting next task for host managed-node2 22736 1727204249.02812: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204249.02815: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204249.02831: getting variables 22736 1727204249.02832: in VariableManager get_vars() 22736 1727204249.02880: Calling all_inventory to load vars for managed-node2 22736 1727204249.02884: Calling groups_inventory to load vars for managed-node2 22736 1727204249.02887: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204249.02899: Calling all_plugins_play to load vars for managed-node2 22736 1727204249.02902: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204249.02905: Calling groups_plugins_play to load vars for managed-node2 22736 1727204249.03083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204249.03267: done with get_vars() 22736 1727204249.03278: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:29 -0400 (0:00:00.042) 0:00:13.818 ***** 22736 1727204249.03354: entering _queue_task() for managed-node2/stat 22736 1727204249.03574: worker is 1 (out of 1 available) 22736 1727204249.03591: exiting _queue_task() for managed-node2/stat 22736 1727204249.03605: done queuing things up, now waiting for results queue to drain 22736 1727204249.03606: waiting for pending results... 
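The skip recorded above for "Ensure ansible_facts used by role are present" follows directly from the guard printed in the log: the role re-runs setup only when one of the facts named in __network_required_facts is missing from ansible_facts, and since a full gather just completed, the difference is empty and the conditional evaluates to False. A hypothetical reconstruction of that task (the real body lives at roles/network/tasks/set_facts.yml:3; the gather_subset value is an assumption, while the when expression and no_log come straight from the log output):

# Sketch only: guard that skips re-gathering when all required facts are present.
- name: Ensure ansible_facts used by role are present
  ansible.builtin.setup:
    gather_subset: min   # assumed value, not shown in the log
  no_log: true           # the skipped result above is censored for this reason
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
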
22736 1727204249.03781: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204249.03875: in run() - task 12b410aa-8751-4f4a-548a-00000000027b 22736 1727204249.03888: variable 'ansible_search_path' from source: unknown 22736 1727204249.03894: variable 'ansible_search_path' from source: unknown 22736 1727204249.03931: calling self._execute() 22736 1727204249.04002: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204249.04009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204249.04021: variable 'omit' from source: magic vars 22736 1727204249.04331: variable 'ansible_distribution_major_version' from source: facts 22736 1727204249.04342: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204249.04555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204249.04774: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204249.04814: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204249.04847: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204249.04877: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204249.04956: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204249.04976: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204249.05001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204249.05025: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204249.05101: variable '__network_is_ostree' from source: set_fact 22736 1727204249.05108: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204249.05111: when evaluation is False, skipping this task 22736 1727204249.05118: _execute() done 22736 1727204249.05123: dumping result to json 22736 1727204249.05126: done dumping result, returning 22736 1727204249.05134: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-4f4a-548a-00000000027b] 22736 1727204249.05138: sending task result for task 12b410aa-8751-4f4a-548a-00000000027b 22736 1727204249.05229: done sending task result for task 12b410aa-8751-4f4a-548a-00000000027b 22736 1727204249.05232: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204249.05315: no more pending results, returning what we have 22736 1727204249.05318: results queue empty 22736 1727204249.05320: checking for any_errors_fatal 22736 1727204249.05325: done checking for any_errors_fatal 22736 1727204249.05326: checking for 
max_fail_percentage 22736 1727204249.05327: done checking for max_fail_percentage 22736 1727204249.05328: checking to see if all hosts have failed and the running result is not ok 22736 1727204249.05330: done checking to see if all hosts have failed 22736 1727204249.05331: getting the remaining hosts for this loop 22736 1727204249.05332: done getting the remaining hosts for this loop 22736 1727204249.05335: getting the next task for host managed-node2 22736 1727204249.05341: done getting next task for host managed-node2 22736 1727204249.05345: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204249.05348: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204249.05364: getting variables 22736 1727204249.05365: in VariableManager get_vars() 22736 1727204249.05407: Calling all_inventory to load vars for managed-node2 22736 1727204249.05410: Calling groups_inventory to load vars for managed-node2 22736 1727204249.05412: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204249.05421: Calling all_plugins_play to load vars for managed-node2 22736 1727204249.05424: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204249.05426: Calling groups_plugins_play to load vars for managed-node2 22736 1727204249.05606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204249.05779: done with get_vars() 22736 1727204249.05788: done getting variables 22736 1727204249.05837: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:29 -0400 (0:00:00.025) 0:00:13.843 ***** 22736 1727204249.05863: entering _queue_task() for managed-node2/set_fact 22736 1727204249.06072: worker is 1 (out of 1 available) 22736 1727204249.06086: exiting _queue_task() for managed-node2/set_fact 22736 1727204249.06102: done queuing things up, now waiting for results queue to drain 22736 1727204249.06104: waiting for pending results... 
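Both ostree-related tasks in set_facts.yml are governed by the same condition, "not __network_is_ostree is defined", and both are skipped in this run because the variable already exists from an earlier set_fact (the log notes its source explicitly). A hypothetical sketch of that pair; the stat path, register name, and the value being set are assumptions, while the task names, modules, and condition are taken from the log:

# Sketch only: ostree detection pair, skipped here because
# __network_is_ostree is already defined.
- name: Check if system is ostree
  ansible.builtin.stat:
    path: /run/ostree-booted        # assumed path
  register: __ostree_booted_stat    # assumed register name
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  ansible.builtin.set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"   # assumed value
  when: not __network_is_ostree is defined
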
22736 1727204249.06278: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204249.06373: in run() - task 12b410aa-8751-4f4a-548a-00000000027c 22736 1727204249.06386: variable 'ansible_search_path' from source: unknown 22736 1727204249.06391: variable 'ansible_search_path' from source: unknown 22736 1727204249.06423: calling self._execute() 22736 1727204249.06497: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204249.06503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204249.06515: variable 'omit' from source: magic vars 22736 1727204249.06821: variable 'ansible_distribution_major_version' from source: facts 22736 1727204249.06832: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204249.06971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204249.07193: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204249.07234: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204249.07265: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204249.07298: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204249.07402: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204249.07427: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204249.07452: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204249.07473: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204249.07551: variable '__network_is_ostree' from source: set_fact 22736 1727204249.07558: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204249.07561: when evaluation is False, skipping this task 22736 1727204249.07564: _execute() done 22736 1727204249.07570: dumping result to json 22736 1727204249.07572: done dumping result, returning 22736 1727204249.07581: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-4f4a-548a-00000000027c] 22736 1727204249.07586: sending task result for task 12b410aa-8751-4f4a-548a-00000000027c 22736 1727204249.07674: done sending task result for task 12b410aa-8751-4f4a-548a-00000000027c 22736 1727204249.07678: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204249.07732: no more pending results, returning what we have 22736 1727204249.07736: results queue empty 22736 1727204249.07738: checking for any_errors_fatal 22736 1727204249.07743: done checking for any_errors_fatal 22736 
1727204249.07744: checking for max_fail_percentage 22736 1727204249.07746: done checking for max_fail_percentage 22736 1727204249.07747: checking to see if all hosts have failed and the running result is not ok 22736 1727204249.07748: done checking to see if all hosts have failed 22736 1727204249.07749: getting the remaining hosts for this loop 22736 1727204249.07750: done getting the remaining hosts for this loop 22736 1727204249.07755: getting the next task for host managed-node2 22736 1727204249.07764: done getting next task for host managed-node2 22736 1727204249.07768: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204249.07771: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204249.07786: getting variables 22736 1727204249.07787: in VariableManager get_vars() 22736 1727204249.07834: Calling all_inventory to load vars for managed-node2 22736 1727204249.07837: Calling groups_inventory to load vars for managed-node2 22736 1727204249.07840: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204249.07850: Calling all_plugins_play to load vars for managed-node2 22736 1727204249.07853: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204249.07856: Calling groups_plugins_play to load vars for managed-node2 22736 1727204249.08016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204249.08209: done with get_vars() 22736 1727204249.08218: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:29 -0400 (0:00:00.024) 0:00:13.867 ***** 22736 1727204249.08294: entering _queue_task() for managed-node2/service_facts 22736 1727204249.08295: Creating lock for service_facts 22736 1727204249.08506: worker is 1 (out of 1 available) 22736 1727204249.08525: exiting _queue_task() for managed-node2/service_facts 22736 1727204249.08539: done queuing things up, now waiting for results queue to drain 22736 1727204249.08540: waiting for pending results... 
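
The task queued here, "Check which services are running", uses the service_facts module (note the new ANSIBALLZ lock created for it), whose results are stored under ansible_facts.services for later tasks to inspect. A minimal sketch of such a task plus a hypothetical consumer of the gathered facts; the debug task and its message are illustrative, not part of the role:

    - name: Check which services are running
      ansible.builtin.service_facts:

    - name: Show NetworkManager state (hypothetical consumer)
      ansible.builtin.debug:
        msg: "NetworkManager is {{ ansible_facts.services['NetworkManager.service'].state }}"
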
22736 1727204249.08706: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204249.08792: in run() - task 12b410aa-8751-4f4a-548a-00000000027e 22736 1727204249.08805: variable 'ansible_search_path' from source: unknown 22736 1727204249.08809: variable 'ansible_search_path' from source: unknown 22736 1727204249.08841: calling self._execute() 22736 1727204249.08918: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204249.08921: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204249.08931: variable 'omit' from source: magic vars 22736 1727204249.09232: variable 'ansible_distribution_major_version' from source: facts 22736 1727204249.09243: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204249.09250: variable 'omit' from source: magic vars 22736 1727204249.09294: variable 'omit' from source: magic vars 22736 1727204249.09331: variable 'omit' from source: magic vars 22736 1727204249.09359: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204249.09391: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204249.09409: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204249.09427: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204249.09440: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204249.09467: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204249.09470: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204249.09475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204249.09560: Set connection var ansible_timeout to 10 22736 1727204249.09571: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204249.09580: Set connection var ansible_shell_executable to /bin/sh 22736 1727204249.09583: Set connection var ansible_shell_type to sh 22736 1727204249.09591: Set connection var ansible_pipelining to False 22736 1727204249.09594: Set connection var ansible_connection to ssh 22736 1727204249.09615: variable 'ansible_shell_executable' from source: unknown 22736 1727204249.09618: variable 'ansible_connection' from source: unknown 22736 1727204249.09621: variable 'ansible_module_compression' from source: unknown 22736 1727204249.09624: variable 'ansible_shell_type' from source: unknown 22736 1727204249.09626: variable 'ansible_shell_executable' from source: unknown 22736 1727204249.09628: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204249.09636: variable 'ansible_pipelining' from source: unknown 22736 1727204249.09639: variable 'ansible_timeout' from source: unknown 22736 1727204249.09644: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204249.09808: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204249.09818: variable 'omit' from source: magic vars 22736 
1727204249.09822: starting attempt loop 22736 1727204249.09827: running the handler 22736 1727204249.09841: _low_level_execute_command(): starting 22736 1727204249.09848: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204249.10400: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204249.10404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.10407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204249.10410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.10466: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204249.10469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204249.10475: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204249.10520: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204249.12331: stdout chunk (state=3): >>>/root <<< 22736 1727204249.12441: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204249.12505: stderr chunk (state=3): >>><<< 22736 1727204249.12509: stdout chunk (state=3): >>><<< 22736 1727204249.12531: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204249.12543: _low_level_execute_command(): starting 22736 1727204249.12551: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741 `" && echo 
ansible-tmp-1727204249.1253085-23477-167622380605741="` echo /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741 `" ) && sleep 0' 22736 1727204249.13048: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204249.13051: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204249.13054: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204249.13064: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204249.13067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.13113: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204249.13117: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204249.13168: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204249.15324: stdout chunk (state=3): >>>ansible-tmp-1727204249.1253085-23477-167622380605741=/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741 <<< 22736 1727204249.15444: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204249.15501: stderr chunk (state=3): >>><<< 22736 1727204249.15504: stdout chunk (state=3): >>><<< 22736 1727204249.15524: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204249.1253085-23477-167622380605741=/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204249.15573: variable 'ansible_module_compression' from source: unknown 22736 1727204249.15614: ANSIBALLZ: Using lock for service_facts 
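
The remote temp directory created above is where the AnsiballZ payload for service_facts will be copied before it is executed. This write-locally, copy-over-SFTP, then-execute sequence happens because ansible_pipelining was set to False earlier in this task; with pipelining enabled, the copy step is skipped and the module is streamed to the remote Python interpreter over the existing SSH session (subject to the usual become/requiretty caveats). A minimal sketch of enabling it via an inventory variable, with an illustrative file path:

    # group_vars/all.yml (illustrative path)
    ansible_pipelining: true
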
22736 1727204249.15619: ANSIBALLZ: Acquiring lock 22736 1727204249.15622: ANSIBALLZ: Lock acquired: 140553532308416 22736 1727204249.15628: ANSIBALLZ: Creating module 22736 1727204249.26841: ANSIBALLZ: Writing module into payload 22736 1727204249.26930: ANSIBALLZ: Writing module 22736 1727204249.26948: ANSIBALLZ: Renaming module 22736 1727204249.26955: ANSIBALLZ: Done creating module 22736 1727204249.26970: variable 'ansible_facts' from source: unknown 22736 1727204249.27026: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py 22736 1727204249.27145: Sending initial data 22736 1727204249.27148: Sent initial data (162 bytes) 22736 1727204249.27636: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204249.27640: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.27642: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204249.27645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204249.27647: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.27696: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204249.27721: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204249.27728: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204249.27761: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204249.29516: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22736 1727204249.29526: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204249.29558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204249.29600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp0n5c6q9q /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py <<< 22736 1727204249.29604: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py" <<< 22736 1727204249.29636: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp0n5c6q9q" to remote "/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py" <<< 22736 1727204249.30440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204249.30517: stderr chunk (state=3): >>><<< 22736 1727204249.30520: stdout chunk (state=3): >>><<< 22736 1727204249.30540: done transferring module to remote 22736 1727204249.30552: _low_level_execute_command(): starting 22736 1727204249.30559: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/ /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py && sleep 0' 22736 1727204249.31042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204249.31046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204249.31048: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.31050: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204249.31056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.31112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204249.31118: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204249.31158: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204249.33167: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204249.33221: stderr chunk (state=3): >>><<< 22736 1727204249.33225: stdout chunk (state=3): >>><<< 22736 1727204249.33240: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204249.33243: _low_level_execute_command(): starting 22736 1727204249.33249: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/AnsiballZ_service_facts.py && sleep 0' 22736 1727204249.33724: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204249.33728: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204249.33731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204249.33733: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204249.33735: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204249.33793: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204249.33797: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204249.33845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204251.37839: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", 
"source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", 
"state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "seria<<< 22736 1727204251.37900: stdout chunk (state=3): >>>l-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": 
"systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": 
"dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": 
{"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd<<< 22736 1727204251.37928: stdout chunk (state=3): >>>"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", 
"status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22736 1727204251.39697: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204251.39719: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 22736 1727204251.39766: stderr chunk (state=3): >>><<< 22736 1727204251.39780: stdout chunk (state=3): >>><<< 22736 1727204251.39821: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": 
"inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": 
"running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, 
"systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", 
"source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": 
{"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": 
{"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204251.41037: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204251.41056: _low_level_execute_command(): starting 22736 1727204251.41068: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204249.1253085-23477-167622380605741/ > /dev/null 2>&1 && sleep 0' 22736 1727204251.42004: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204251.42054: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204251.42067: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204251.42121: stderr chunk (state=3): 
>>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204251.44291: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204251.44317: stdout chunk (state=3): >>><<< 22736 1727204251.44331: stderr chunk (state=3): >>><<< 22736 1727204251.44354: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204251.44369: handler run complete 22736 1727204251.44896: variable 'ansible_facts' from source: unknown 22736 1727204251.44953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204251.45746: variable 'ansible_facts' from source: unknown 22736 1727204251.47466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204251.47865: attempt loop complete, returning result 22736 1727204251.47884: _execute() done 22736 1727204251.47896: dumping result to json 22736 1727204251.47994: done dumping result, returning 22736 1727204251.48011: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-4f4a-548a-00000000027e] 22736 1727204251.48025: sending task result for task 12b410aa-8751-4f4a-548a-00000000027e 22736 1727204251.49311: done sending task result for task 12b410aa-8751-4f4a-548a-00000000027e ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204251.49469: no more pending results, returning what we have 22736 1727204251.49473: results queue empty 22736 1727204251.49474: checking for any_errors_fatal 22736 1727204251.49480: done checking for any_errors_fatal 22736 1727204251.49481: checking for max_fail_percentage 22736 1727204251.49483: done checking for max_fail_percentage 22736 1727204251.49484: checking to see if all hosts have failed and the running result is not ok 22736 1727204251.49486: done checking to see if all hosts have failed 22736 1727204251.49487: getting the remaining hosts for this loop 22736 1727204251.49490: done getting the remaining hosts for this loop 22736 1727204251.49495: getting the next task for host managed-node2 22736 1727204251.49502: done getting next task for host managed-node2 22736 1727204251.49507: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are 
installed 22736 1727204251.49511: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204251.49527: getting variables 22736 1727204251.49529: in VariableManager get_vars() 22736 1727204251.49574: Calling all_inventory to load vars for managed-node2 22736 1727204251.49578: Calling groups_inventory to load vars for managed-node2 22736 1727204251.49580: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204251.49794: Calling all_plugins_play to load vars for managed-node2 22736 1727204251.49799: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204251.49806: WORKER PROCESS EXITING 22736 1727204251.49820: Calling groups_plugins_play to load vars for managed-node2 22736 1727204251.50477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204251.51209: done with get_vars() 22736 1727204251.51237: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:31 -0400 (0:00:02.430) 0:00:16.298 ***** 22736 1727204251.51366: entering _queue_task() for managed-node2/package_facts 22736 1727204251.51368: Creating lock for package_facts 22736 1727204251.51755: worker is 1 (out of 1 available) 22736 1727204251.51883: exiting _queue_task() for managed-node2/package_facts 22736 1727204251.51896: done queuing things up, now waiting for results queue to drain 22736 1727204251.51897: waiting for pending results... 
22736 1727204251.52106: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22736 1727204251.52263: in run() - task 12b410aa-8751-4f4a-548a-00000000027f 22736 1727204251.52286: variable 'ansible_search_path' from source: unknown 22736 1727204251.52298: variable 'ansible_search_path' from source: unknown 22736 1727204251.52354: calling self._execute() 22736 1727204251.52463: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204251.52478: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204251.52497: variable 'omit' from source: magic vars 22736 1727204251.52951: variable 'ansible_distribution_major_version' from source: facts 22736 1727204251.52977: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204251.52991: variable 'omit' from source: magic vars 22736 1727204251.53189: variable 'omit' from source: magic vars 22736 1727204251.53194: variable 'omit' from source: magic vars 22736 1727204251.53197: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204251.53239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204251.53270: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204251.53308: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204251.53336: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204251.53379: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204251.53391: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204251.53406: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204251.53563: Set connection var ansible_timeout to 10 22736 1727204251.53585: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204251.53604: Set connection var ansible_shell_executable to /bin/sh 22736 1727204251.53614: Set connection var ansible_shell_type to sh 22736 1727204251.53643: Set connection var ansible_pipelining to False 22736 1727204251.53646: Set connection var ansible_connection to ssh 22736 1727204251.53735: variable 'ansible_shell_executable' from source: unknown 22736 1727204251.53738: variable 'ansible_connection' from source: unknown 22736 1727204251.53741: variable 'ansible_module_compression' from source: unknown 22736 1727204251.53743: variable 'ansible_shell_type' from source: unknown 22736 1727204251.53747: variable 'ansible_shell_executable' from source: unknown 22736 1727204251.53752: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204251.53754: variable 'ansible_pipelining' from source: unknown 22736 1727204251.53757: variable 'ansible_timeout' from source: unknown 22736 1727204251.53759: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204251.54026: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204251.54049: variable 'omit' from source: magic vars 22736 
1727204251.54067: starting attempt loop 22736 1727204251.54083: running the handler 22736 1727204251.54107: _low_level_execute_command(): starting 22736 1727204251.54171: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204251.55003: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204251.55059: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204251.55083: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204251.55173: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204251.55217: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204251.55240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204251.55272: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204251.55376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204251.57171: stdout chunk (state=3): >>>/root <<< 22736 1727204251.57393: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204251.57397: stdout chunk (state=3): >>><<< 22736 1727204251.57400: stderr chunk (state=3): >>><<< 22736 1727204251.57540: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204251.57545: _low_level_execute_command(): starting 22736 1727204251.57549: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878 `" && echo 
ansible-tmp-1727204251.5742528-23516-108513614984878="` echo /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878 `" ) && sleep 0' 22736 1727204251.58175: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204251.58323: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204251.58327: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204251.58359: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204251.58378: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204251.58404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204251.58484: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204251.60544: stdout chunk (state=3): >>>ansible-tmp-1727204251.5742528-23516-108513614984878=/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878 <<< 22736 1727204251.60896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204251.60900: stdout chunk (state=3): >>><<< 22736 1727204251.60903: stderr chunk (state=3): >>><<< 22736 1727204251.60906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204251.5742528-23516-108513614984878=/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204251.60909: variable 'ansible_module_compression' from source: unknown 22736 1727204251.60928: ANSIBALLZ: Using lock for package_facts 22736 1727204251.60936: ANSIBALLZ: Acquiring lock 22736 
1727204251.60945: ANSIBALLZ: Lock acquired: 140553532305296 22736 1727204251.60953: ANSIBALLZ: Creating module 22736 1727204252.05665: ANSIBALLZ: Writing module into payload 22736 1727204252.05877: ANSIBALLZ: Writing module 22736 1727204252.05929: ANSIBALLZ: Renaming module 22736 1727204252.06000: ANSIBALLZ: Done creating module 22736 1727204252.06003: variable 'ansible_facts' from source: unknown 22736 1727204252.06239: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py 22736 1727204252.06466: Sending initial data 22736 1727204252.06470: Sent initial data (162 bytes) 22736 1727204252.07209: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204252.07270: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204252.07295: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204252.07333: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204252.07407: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204252.09158: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204252.09228: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204252.09259: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmppp9eihv3 /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py <<< 22736 1727204252.09263: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py" <<< 22736 1727204252.09311: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmppp9eihv3" to remote "/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py" <<< 22736 1727204252.11812: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204252.12000: stderr chunk (state=3): >>><<< 22736 1727204252.12004: stdout chunk (state=3): >>><<< 22736 1727204252.12007: done transferring module to remote 22736 1727204252.12009: _low_level_execute_command(): starting 22736 1727204252.12012: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/ /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py && sleep 0' 22736 1727204252.12681: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204252.12698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204252.12777: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204252.12853: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204252.12898: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204252.12973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204252.15045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204252.15061: stderr chunk (state=3): >>><<< 22736 1727204252.15083: stdout chunk (state=3): >>><<< 22736 1727204252.15116: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204252.15127: _low_level_execute_command(): starting 22736 1727204252.15139: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/AnsiballZ_package_facts.py && sleep 0' 22736 1727204252.15928: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204252.16053: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204252.16102: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204252.16149: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204252.80732: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "a<<< 22736 1727204252.80761: stdout chunk (state=3): >>>rch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": 
"20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 
22736 1727204252.80772: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", 
"version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "relea<<< 22736 1727204252.80839: stdout chunk (state=3): >>>se": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": 
[{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 22736 1727204252.80850: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": 
"0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils",<<< 22736 1727204252.80855: stdout chunk (state=3): >>> "version": "9.3", "release": "6.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 22736 1727204252.80876: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 22736 1727204252.80942: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": 
"1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", 
"release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": 
"100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": <<< 22736 1727204252.80964: stdout chunk (state=3): >>>"perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": 
"20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": 
"perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": 
"1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 22736 1727204252.81007: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": 
"python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": 
[{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22736 1727204252.83014: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204252.83048: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204252.83050: stdout chunk (state=3): >>><<< 22736 1727204252.83053: stderr chunk (state=3): >>><<< 22736 1727204252.83306: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204252.91714: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204252.91749: _low_level_execute_command(): starting 22736 1727204252.91759: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204251.5742528-23516-108513614984878/ > /dev/null 2>&1 && sleep 0' 22736 1727204252.92437: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204252.92452: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204252.92474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204252.92596: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204252.92618: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204252.92639: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204252.92724: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204252.94780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204252.94795: stdout chunk (state=3): >>><<< 22736 1727204252.94808: stderr chunk (state=3): >>><<< 22736 1727204252.94830: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204252.94843: handler run complete 22736 1727204252.96291: variable 'ansible_facts' from source: unknown 22736 1727204252.97010: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.00760: variable 'ansible_facts' from source: unknown 22736 1727204253.01797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.04321: attempt loop complete, returning result 22736 1727204253.04361: _execute() done 22736 1727204253.04369: dumping result to json 22736 1727204253.05195: done dumping result, returning 22736 1727204253.05199: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-4f4a-548a-00000000027f] 22736 1727204253.05201: sending task result for task 12b410aa-8751-4f4a-548a-00000000027f 22736 1727204253.09768: done sending task result for task 12b410aa-8751-4f4a-548a-00000000027f 22736 1727204253.09772: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204253.09874: no more pending results, returning what we have 22736 1727204253.09878: results queue empty 22736 1727204253.09879: checking for any_errors_fatal 22736 1727204253.09883: done checking for any_errors_fatal 22736 1727204253.09884: checking for max_fail_percentage 22736 1727204253.09886: done checking for max_fail_percentage 22736 1727204253.09887: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.09888: done checking to see if all hosts have failed 22736 1727204253.09893: getting the remaining hosts for this loop 22736 1727204253.09895: done getting the remaining hosts for this loop 22736 1727204253.09899: getting the next task for host managed-node2 22736 1727204253.09906: done getting next task for host managed-node2 22736 1727204253.09911: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204253.09913: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.09924: getting variables 22736 1727204253.09926: in VariableManager get_vars() 22736 1727204253.09962: Calling all_inventory to load vars for managed-node2 22736 1727204253.09966: Calling groups_inventory to load vars for managed-node2 22736 1727204253.09969: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.09980: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.09984: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.09988: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.11953: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.14897: done with get_vars() 22736 1727204253.14934: done getting variables 22736 1727204253.15013: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:33 -0400 (0:00:01.636) 0:00:17.935 ***** 22736 1727204253.15051: entering _queue_task() for managed-node2/debug 22736 1727204253.15605: worker is 1 (out of 1 available) 22736 1727204253.15617: exiting _queue_task() for managed-node2/debug 22736 1727204253.15627: done queuing things up, now waiting for results queue to drain 22736 1727204253.15628: waiting for pending results... 22736 1727204253.15759: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204253.15872: in run() - task 12b410aa-8751-4f4a-548a-00000000001a 22736 1727204253.15964: variable 'ansible_search_path' from source: unknown 22736 1727204253.15969: variable 'ansible_search_path' from source: unknown 22736 1727204253.15973: calling self._execute() 22736 1727204253.16053: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.16072: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.16092: variable 'omit' from source: magic vars 22736 1727204253.16552: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.16571: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.16584: variable 'omit' from source: magic vars 22736 1727204253.16638: variable 'omit' from source: magic vars 22736 1727204253.16768: variable 'network_provider' from source: set_fact 22736 1727204253.16833: variable 'omit' from source: magic vars 22736 1727204253.16852: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204253.16904: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204253.16932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204253.16964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204253.16983: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 
1727204253.17050: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204253.17054: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.17057: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.17179: Set connection var ansible_timeout to 10 22736 1727204253.17202: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204253.17218: Set connection var ansible_shell_executable to /bin/sh 22736 1727204253.17267: Set connection var ansible_shell_type to sh 22736 1727204253.17271: Set connection var ansible_pipelining to False 22736 1727204253.17274: Set connection var ansible_connection to ssh 22736 1727204253.17277: variable 'ansible_shell_executable' from source: unknown 22736 1727204253.17285: variable 'ansible_connection' from source: unknown 22736 1727204253.17295: variable 'ansible_module_compression' from source: unknown 22736 1727204253.17303: variable 'ansible_shell_type' from source: unknown 22736 1727204253.17311: variable 'ansible_shell_executable' from source: unknown 22736 1727204253.17319: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.17328: variable 'ansible_pipelining' from source: unknown 22736 1727204253.17377: variable 'ansible_timeout' from source: unknown 22736 1727204253.17380: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.17530: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204253.17550: variable 'omit' from source: magic vars 22736 1727204253.17561: starting attempt loop 22736 1727204253.17569: running the handler 22736 1727204253.17630: handler run complete 22736 1727204253.17694: attempt loop complete, returning result 22736 1727204253.17697: _execute() done 22736 1727204253.17702: dumping result to json 22736 1727204253.17708: done dumping result, returning 22736 1727204253.17711: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-4f4a-548a-00000000001a] 22736 1727204253.17714: sending task result for task 12b410aa-8751-4f4a-548a-00000000001a 22736 1727204253.18014: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001a 22736 1727204253.18019: WORKER PROCESS EXITING ok: [managed-node2] => {} MSG: Using network provider: nm 22736 1727204253.18083: no more pending results, returning what we have 22736 1727204253.18087: results queue empty 22736 1727204253.18088: checking for any_errors_fatal 22736 1727204253.18099: done checking for any_errors_fatal 22736 1727204253.18100: checking for max_fail_percentage 22736 1727204253.18102: done checking for max_fail_percentage 22736 1727204253.18103: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.18104: done checking to see if all hosts have failed 22736 1727204253.18105: getting the remaining hosts for this loop 22736 1727204253.18107: done getting the remaining hosts for this loop 22736 1727204253.18112: getting the next task for host managed-node2 22736 1727204253.18118: done getting next task for host managed-node2 22736 1727204253.18123: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state 
configuration if using the `network_state` variable with the initscripts provider 22736 1727204253.18125: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204253.18137: getting variables 22736 1727204253.18139: in VariableManager get_vars() 22736 1727204253.18183: Calling all_inventory to load vars for managed-node2 22736 1727204253.18187: Calling groups_inventory to load vars for managed-node2 22736 1727204253.18313: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.18326: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.18330: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.18334: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.20497: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.23450: done with get_vars() 22736 1727204253.23508: done getting variables 22736 1727204253.23586: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.085) 0:00:18.021 ***** 22736 1727204253.23629: entering _queue_task() for managed-node2/fail 22736 1727204253.24099: worker is 1 (out of 1 available) 22736 1727204253.24113: exiting _queue_task() for managed-node2/fail 22736 1727204253.24126: done queuing things up, now waiting for results queue to drain 22736 1727204253.24127: waiting for pending results... 
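The entries above show the role's package check and provider announcement: package_facts runs with manager auto and no_log enabled (hence the censored result), and a debug task then reports "Using network provider: nm" from the network_provider fact set earlier. A minimal sketch of tasks with that shape, assuming the exact message template and module spelling (the role's real tasks/main.yml may differ):

    # Gather the installed-package inventory; no_log keeps the large
    # package list out of task results (seen above as "censored").
    - name: Check which packages are installed
      ansible.builtin.package_facts:
        manager: auto        # matches module_args {"manager": ["auto"]} in the log
      no_log: true

    # Announce which backend the role decided to use (nm vs initscripts).
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"

With no_log set, only the censored placeholder appears in the play output even at this verbosity level, which is exactly what the ok: [managed-node2] result above reflects.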
22736 1727204253.24333: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22736 1727204253.24479: in run() - task 12b410aa-8751-4f4a-548a-00000000001b 22736 1727204253.24505: variable 'ansible_search_path' from source: unknown 22736 1727204253.24515: variable 'ansible_search_path' from source: unknown 22736 1727204253.24565: calling self._execute() 22736 1727204253.24672: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.24694: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.24713: variable 'omit' from source: magic vars 22736 1727204253.25169: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.25191: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.25357: variable 'network_state' from source: role '' defaults 22736 1727204253.25396: Evaluated conditional (network_state != {}): False 22736 1727204253.25400: when evaluation is False, skipping this task 22736 1727204253.25403: _execute() done 22736 1727204253.25406: dumping result to json 22736 1727204253.25409: done dumping result, returning 22736 1727204253.25444: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-4f4a-548a-00000000001b] 22736 1727204253.25448: sending task result for task 12b410aa-8751-4f4a-548a-00000000001b 22736 1727204253.25626: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001b 22736 1727204253.25631: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204253.25691: no more pending results, returning what we have 22736 1727204253.25799: results queue empty 22736 1727204253.25800: checking for any_errors_fatal 22736 1727204253.25808: done checking for any_errors_fatal 22736 1727204253.25809: checking for max_fail_percentage 22736 1727204253.25811: done checking for max_fail_percentage 22736 1727204253.25812: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.25813: done checking to see if all hosts have failed 22736 1727204253.25814: getting the remaining hosts for this loop 22736 1727204253.25816: done getting the remaining hosts for this loop 22736 1727204253.25821: getting the next task for host managed-node2 22736 1727204253.25828: done getting next task for host managed-node2 22736 1727204253.25833: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204253.25837: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.25855: getting variables 22736 1727204253.25856: in VariableManager get_vars() 22736 1727204253.26004: Calling all_inventory to load vars for managed-node2 22736 1727204253.26009: Calling groups_inventory to load vars for managed-node2 22736 1727204253.26011: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.26023: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.26026: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.26031: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.27488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.29725: done with get_vars() 22736 1727204253.29754: done getting variables 22736 1727204253.29810: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.062) 0:00:18.083 ***** 22736 1727204253.29838: entering _queue_task() for managed-node2/fail 22736 1727204253.30105: worker is 1 (out of 1 available) 22736 1727204253.30122: exiting _queue_task() for managed-node2/fail 22736 1727204253.30134: done queuing things up, now waiting for results queue to drain 22736 1727204253.30136: waiting for pending results... 
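Both abort tasks around this point (main.yml:11 above and main.yml:18 below) are skipped for the same reason: their when list begins with network_state != {}, and network_state comes from the role defaults as an empty dict, so the first condition is False and the remaining conditions are never evaluated. A minimal sketch of a guard of that shape; only the network_state != {} condition is taken from the log, while the failure message and the second condition are illustrative assumptions:

    # Guard task: abort early instead of applying an unsupported combination.
    - name: Abort applying the network state configuration if using the network_state variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider   # assumed wording
      when:
        - network_state != {}                  # condition recorded in the log (False here)
        - network_provider == "initscripts"    # assumed additional condition, not evaluated in this run

Because when conditions are checked in order and the first one is already False, Ansible records false_condition: "network_state != {}" and skips the task, which matches the skipping: [managed-node2] results shown in these entries.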
22736 1727204253.30326: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204253.30407: in run() - task 12b410aa-8751-4f4a-548a-00000000001c 22736 1727204253.30423: variable 'ansible_search_path' from source: unknown 22736 1727204253.30427: variable 'ansible_search_path' from source: unknown 22736 1727204253.30460: calling self._execute() 22736 1727204253.30547: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.30553: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.30563: variable 'omit' from source: magic vars 22736 1727204253.30893: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.30907: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.31011: variable 'network_state' from source: role '' defaults 22736 1727204253.31029: Evaluated conditional (network_state != {}): False 22736 1727204253.31032: when evaluation is False, skipping this task 22736 1727204253.31035: _execute() done 22736 1727204253.31038: dumping result to json 22736 1727204253.31040: done dumping result, returning 22736 1727204253.31047: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-4f4a-548a-00000000001c] 22736 1727204253.31052: sending task result for task 12b410aa-8751-4f4a-548a-00000000001c 22736 1727204253.31147: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001c 22736 1727204253.31150: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204253.31207: no more pending results, returning what we have 22736 1727204253.31211: results queue empty 22736 1727204253.31212: checking for any_errors_fatal 22736 1727204253.31221: done checking for any_errors_fatal 22736 1727204253.31222: checking for max_fail_percentage 22736 1727204253.31223: done checking for max_fail_percentage 22736 1727204253.31224: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.31225: done checking to see if all hosts have failed 22736 1727204253.31226: getting the remaining hosts for this loop 22736 1727204253.31228: done getting the remaining hosts for this loop 22736 1727204253.31233: getting the next task for host managed-node2 22736 1727204253.31240: done getting next task for host managed-node2 22736 1727204253.31244: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204253.31247: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.31264: getting variables 22736 1727204253.31266: in VariableManager get_vars() 22736 1727204253.31408: Calling all_inventory to load vars for managed-node2 22736 1727204253.31412: Calling groups_inventory to load vars for managed-node2 22736 1727204253.31415: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.31433: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.31437: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.31440: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.33326: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.35379: done with get_vars() 22736 1727204253.35419: done getting variables 22736 1727204253.35504: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.056) 0:00:18.140 ***** 22736 1727204253.35539: entering _queue_task() for managed-node2/fail 22736 1727204253.35937: worker is 1 (out of 1 available) 22736 1727204253.35952: exiting _queue_task() for managed-node2/fail 22736 1727204253.35966: done queuing things up, now waiting for results queue to drain 22736 1727204253.35967: waiting for pending results... 
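Note: the next task, at roles/network/tasks/main.yml:25, also uses the 'fail' action. The trace below records ansible_distribution_major_version | int > 9 as True and ansible_distribution in __network_rh_distros as False, which is why it is skipped on this host. A hypothetical sketch of such a guard, limited to the conditions actually recorded here:

    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      fail:
        msg: Teaming is not supported on EL10 or later  # placeholder wording; actual message not in the trace
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        # the real task presumably also checks whether any team connections are requested; not reached in this run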
22736 1727204253.36311: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204253.36349: in run() - task 12b410aa-8751-4f4a-548a-00000000001d 22736 1727204253.36374: variable 'ansible_search_path' from source: unknown 22736 1727204253.36384: variable 'ansible_search_path' from source: unknown 22736 1727204253.36437: calling self._execute() 22736 1727204253.36544: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.36646: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.36650: variable 'omit' from source: magic vars 22736 1727204253.37033: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.37051: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.37284: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204253.39131: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204253.39197: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204253.39229: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204253.39260: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204253.39294: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204253.39595: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.39600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.39603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.39606: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.39608: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.39641: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.39673: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22736 1727204253.39857: variable 'ansible_distribution' from source: facts 22736 1727204253.39868: variable '__network_rh_distros' from source: role '' defaults 22736 1727204253.39892: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22736 1727204253.39906: when evaluation is False, skipping this task 22736 1727204253.39917: _execute() done 22736 1727204253.39989: dumping result to json 22736 1727204253.39993: done dumping result, returning 22736 1727204253.39998: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-4f4a-548a-00000000001d] 22736 1727204253.40001: sending task result for task 12b410aa-8751-4f4a-548a-00000000001d 22736 1727204253.40076: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001d 22736 1727204253.40080: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22736 1727204253.40143: no more pending results, returning what we have 22736 1727204253.40148: results queue empty 22736 1727204253.40149: checking for any_errors_fatal 22736 1727204253.40156: done checking for any_errors_fatal 22736 1727204253.40157: checking for max_fail_percentage 22736 1727204253.40159: done checking for max_fail_percentage 22736 1727204253.40160: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.40161: done checking to see if all hosts have failed 22736 1727204253.40162: getting the remaining hosts for this loop 22736 1727204253.40163: done getting the remaining hosts for this loop 22736 1727204253.40168: getting the next task for host managed-node2 22736 1727204253.40176: done getting next task for host managed-node2 22736 1727204253.40181: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204253.40184: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.40459: getting variables 22736 1727204253.40462: in VariableManager get_vars() 22736 1727204253.40549: Calling all_inventory to load vars for managed-node2 22736 1727204253.40553: Calling groups_inventory to load vars for managed-node2 22736 1727204253.40556: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.40566: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.40569: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.40573: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.41821: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.43405: done with get_vars() 22736 1727204253.43437: done getting variables 22736 1727204253.43527: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.080) 0:00:18.220 ***** 22736 1727204253.43555: entering _queue_task() for managed-node2/dnf 22736 1727204253.43820: worker is 1 (out of 1 available) 22736 1727204253.43836: exiting _queue_task() for managed-node2/dnf 22736 1727204253.43849: done queuing things up, now waiting for results queue to drain 22736 1727204253.43851: waiting for pending results... 
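Note: this task, at roles/network/tasks/main.yml:36, uses the 'dnf' action. The trace below records (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7) as True and (__network_wireless_connections_defined or __network_team_connections_defined) as False, so the check is skipped. A rough sketch under the assumption that the task queries the network packages in check mode; the package list, state, and check_mode setting are not visible in this trace:

    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      dnf:
        name: "{{ network_packages }}"   # assumption: package list variable, not shown at this point in the trace
        state: latest                    # assumption
      check_mode: true                   # assumption, suggested by the "check if updates are available" wording
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined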
22736 1727204253.44036: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204253.44116: in run() - task 12b410aa-8751-4f4a-548a-00000000001e 22736 1727204253.44129: variable 'ansible_search_path' from source: unknown 22736 1727204253.44133: variable 'ansible_search_path' from source: unknown 22736 1727204253.44165: calling self._execute() 22736 1727204253.44246: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.44253: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.44263: variable 'omit' from source: magic vars 22736 1727204253.44585: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.44598: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.44777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204253.46526: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204253.46868: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204253.46903: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204253.46939: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204253.46961: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204253.47033: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.47061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.47082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.47118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.47130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.47227: variable 'ansible_distribution' from source: facts 22736 1727204253.47231: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.47239: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22736 1727204253.47336: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204253.47449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.47468: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.47497: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.47530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.47543: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.47580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.47605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.47626: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.47657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.47669: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.47708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.47730: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.47750: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.47780: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.47793: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.47925: variable 'network_connections' from source: play vars 22736 1727204253.47937: variable 'interface' from source: set_fact 22736 1727204253.47998: variable 'interface' from source: set_fact 22736 1727204253.48007: variable 'interface' from source: set_fact 22736 1727204253.48062: variable 'interface' from source: set_fact 22736 1727204253.48123: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 22736 1727204253.48259: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204253.48291: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204253.48318: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204253.48343: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204253.48384: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204253.48404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204253.48430: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.48451: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204253.48507: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204253.51659: variable 'network_connections' from source: play vars 22736 1727204253.51664: variable 'interface' from source: set_fact 22736 1727204253.51735: variable 'interface' from source: set_fact 22736 1727204253.51739: variable 'interface' from source: set_fact 22736 1727204253.51780: variable 'interface' from source: set_fact 22736 1727204253.51812: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204253.51818: when evaluation is False, skipping this task 22736 1727204253.51821: _execute() done 22736 1727204253.51823: dumping result to json 22736 1727204253.51826: done dumping result, returning 22736 1727204253.51831: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-00000000001e] 22736 1727204253.51836: sending task result for task 12b410aa-8751-4f4a-548a-00000000001e 22736 1727204253.51931: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001e 22736 1727204253.51934: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204253.51996: no more pending results, returning what we have 22736 1727204253.52000: results queue empty 22736 1727204253.52001: checking for any_errors_fatal 22736 1727204253.52009: done checking for any_errors_fatal 22736 1727204253.52009: checking for max_fail_percentage 22736 1727204253.52011: done checking for max_fail_percentage 22736 1727204253.52012: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.52015: done checking to see if all hosts have failed 22736 1727204253.52016: getting the remaining hosts for this loop 22736 1727204253.52018: done getting the remaining hosts for this loop 22736 
1727204253.52022: getting the next task for host managed-node2 22736 1727204253.52028: done getting next task for host managed-node2 22736 1727204253.52033: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204253.52035: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204253.52050: getting variables 22736 1727204253.52052: in VariableManager get_vars() 22736 1727204253.52093: Calling all_inventory to load vars for managed-node2 22736 1727204253.52096: Calling groups_inventory to load vars for managed-node2 22736 1727204253.52098: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.52109: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.52114: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.52118: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.55907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.57465: done with get_vars() 22736 1727204253.57488: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204253.57549: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.140) 0:00:18.360 ***** 22736 1727204253.57569: entering _queue_task() for managed-node2/yum 22736 1727204253.57570: Creating lock for yum 22736 1727204253.57845: worker is 1 (out of 1 available) 22736 1727204253.57860: exiting _queue_task() for managed-node2/yum 22736 1727204253.57873: done queuing things up, now waiting for results queue to drain 22736 1727204253.57874: waiting for pending results... 
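Note: the YUM variant of the same check lives at roles/network/tasks/main.yml:48; the trace also records ansible.builtin.yum being redirected to ansible.builtin.dnf on this system. It is skipped below because ansible_distribution_major_version | int < 8 is False. A sketch analogous to the DNF task above, with the same caveats about assumed parameters:

    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      yum:
        name: "{{ network_packages }}"   # assumption, not shown at this point in the trace
        state: latest                    # assumption
      check_mode: true                   # assumption
      when:
        - ansible_distribution_major_version | int < 8
        - __network_wireless_connections_defined or __network_team_connections_defined   # inferred from the task name; not reached in this run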
22736 1727204253.58062: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204253.58149: in run() - task 12b410aa-8751-4f4a-548a-00000000001f 22736 1727204253.58162: variable 'ansible_search_path' from source: unknown 22736 1727204253.58165: variable 'ansible_search_path' from source: unknown 22736 1727204253.58204: calling self._execute() 22736 1727204253.58279: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.58285: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.58298: variable 'omit' from source: magic vars 22736 1727204253.58626: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.58638: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.58794: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204253.60576: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204253.60641: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204253.60673: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204253.60706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204253.60732: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204253.60803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.60829: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.60854: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.60887: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.60901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.60993: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.61008: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22736 1727204253.61011: when evaluation is False, skipping this task 22736 1727204253.61017: _execute() done 22736 1727204253.61020: dumping result to json 22736 1727204253.61023: done dumping result, returning 22736 1727204253.61031: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-00000000001f] 22736 
1727204253.61034: sending task result for task 12b410aa-8751-4f4a-548a-00000000001f 22736 1727204253.61142: done sending task result for task 12b410aa-8751-4f4a-548a-00000000001f 22736 1727204253.61146: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22736 1727204253.61220: no more pending results, returning what we have 22736 1727204253.61224: results queue empty 22736 1727204253.61225: checking for any_errors_fatal 22736 1727204253.61235: done checking for any_errors_fatal 22736 1727204253.61235: checking for max_fail_percentage 22736 1727204253.61237: done checking for max_fail_percentage 22736 1727204253.61239: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.61240: done checking to see if all hosts have failed 22736 1727204253.61240: getting the remaining hosts for this loop 22736 1727204253.61242: done getting the remaining hosts for this loop 22736 1727204253.61247: getting the next task for host managed-node2 22736 1727204253.61253: done getting next task for host managed-node2 22736 1727204253.61257: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204253.61259: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204253.61274: getting variables 22736 1727204253.61276: in VariableManager get_vars() 22736 1727204253.61326: Calling all_inventory to load vars for managed-node2 22736 1727204253.61329: Calling groups_inventory to load vars for managed-node2 22736 1727204253.61332: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.61342: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.61345: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.61348: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.62596: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.64292: done with get_vars() 22736 1727204253.64316: done getting variables 22736 1727204253.64369: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.068) 0:00:18.428 ***** 22736 1727204253.64394: entering _queue_task() for managed-node2/fail 22736 1727204253.64637: worker is 1 (out of 1 available) 22736 1727204253.64652: exiting _queue_task() for managed-node2/fail 22736 1727204253.64665: done queuing things up, now waiting for results queue to drain 22736 1727204253.64667: waiting for pending results... 
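Note: the task at roles/network/tasks/main.yml:60 is another 'fail' guard, skipped below because neither wireless nor team connections are defined. A minimal sketch; the message wording, and any additional consent variable the real task may check, are assumptions:

    - name: >-
        Ask user's consent to restart NetworkManager due to wireless or team
        interfaces
      fail:
        msg: >-
          NetworkManager needs to be restarted to handle wireless or team
          interfaces; set the appropriate role variable to allow this
          # placeholder wording; the actual message is not shown in this trace
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined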
22736 1727204253.64848: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204253.64936: in run() - task 12b410aa-8751-4f4a-548a-000000000020 22736 1727204253.64949: variable 'ansible_search_path' from source: unknown 22736 1727204253.64952: variable 'ansible_search_path' from source: unknown 22736 1727204253.64986: calling self._execute() 22736 1727204253.65072: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.65079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.65091: variable 'omit' from source: magic vars 22736 1727204253.65422: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.65433: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.65536: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204253.65709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204253.67460: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204253.67528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204253.67557: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204253.67587: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204253.67610: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204253.67683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.67708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.67736: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.67769: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.67782: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.67826: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.67850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.67872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.67907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.67922: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.67961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.67982: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.68004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.68037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.68049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.68200: variable 'network_connections' from source: play vars 22736 1727204253.68214: variable 'interface' from source: set_fact 22736 1727204253.68280: variable 'interface' from source: set_fact 22736 1727204253.68284: variable 'interface' from source: set_fact 22736 1727204253.68342: variable 'interface' from source: set_fact 22736 1727204253.68409: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204253.68550: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204253.68583: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204253.68613: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204253.68644: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204253.68680: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204253.68701: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204253.68729: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.68751: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204253.68803: 
variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204253.69013: variable 'network_connections' from source: play vars 22736 1727204253.69021: variable 'interface' from source: set_fact 22736 1727204253.69075: variable 'interface' from source: set_fact 22736 1727204253.69082: variable 'interface' from source: set_fact 22736 1727204253.69135: variable 'interface' from source: set_fact 22736 1727204253.69167: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204253.69170: when evaluation is False, skipping this task 22736 1727204253.69173: _execute() done 22736 1727204253.69176: dumping result to json 22736 1727204253.69179: done dumping result, returning 22736 1727204253.69187: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000020] 22736 1727204253.69199: sending task result for task 12b410aa-8751-4f4a-548a-000000000020 22736 1727204253.69289: done sending task result for task 12b410aa-8751-4f4a-548a-000000000020 22736 1727204253.69293: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204253.69348: no more pending results, returning what we have 22736 1727204253.69352: results queue empty 22736 1727204253.69353: checking for any_errors_fatal 22736 1727204253.69361: done checking for any_errors_fatal 22736 1727204253.69362: checking for max_fail_percentage 22736 1727204253.69364: done checking for max_fail_percentage 22736 1727204253.69365: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.69366: done checking to see if all hosts have failed 22736 1727204253.69367: getting the remaining hosts for this loop 22736 1727204253.69368: done getting the remaining hosts for this loop 22736 1727204253.69373: getting the next task for host managed-node2 22736 1727204253.69379: done getting next task for host managed-node2 22736 1727204253.69384: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22736 1727204253.69386: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.69411: getting variables 22736 1727204253.69413: in VariableManager get_vars() 22736 1727204253.69452: Calling all_inventory to load vars for managed-node2 22736 1727204253.69455: Calling groups_inventory to load vars for managed-node2 22736 1727204253.69457: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.69467: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.69470: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.69473: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.70703: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.72347: done with get_vars() 22736 1727204253.72370: done getting variables 22736 1727204253.72421: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.080) 0:00:18.509 ***** 22736 1727204253.72444: entering _queue_task() for managed-node2/package 22736 1727204253.72680: worker is 1 (out of 1 available) 22736 1727204253.72697: exiting _queue_task() for managed-node2/package 22736 1727204253.72710: done queuing things up, now waiting for results queue to drain 22736 1727204253.72711: waiting for pending results... 22736 1727204253.72894: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22736 1727204253.72970: in run() - task 12b410aa-8751-4f4a-548a-000000000021 22736 1727204253.72983: variable 'ansible_search_path' from source: unknown 22736 1727204253.72986: variable 'ansible_search_path' from source: unknown 22736 1727204253.73022: calling self._execute() 22736 1727204253.73106: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.73113: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.73125: variable 'omit' from source: magic vars 22736 1727204253.73596: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.73599: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.73840: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204253.74187: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204253.74258: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204253.74305: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204253.74404: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204253.74564: variable 'network_packages' from source: role '' defaults 22736 1727204253.74658: variable '__network_provider_setup' from source: role '' defaults 22736 1727204253.74685: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204253.74745: variable 
'__network_service_name_default_nm' from source: role '' defaults 22736 1727204253.74753: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204253.74810: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204253.74966: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204253.77296: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204253.77299: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204253.77302: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204253.77330: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204253.77363: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204253.77457: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.77492: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.77530: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.77587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.77612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.77672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.77708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.77745: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.77801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.77825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.78129: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204253.78298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.78335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.78370: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.78428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.78449: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.78566: variable 'ansible_python' from source: facts 22736 1727204253.78602: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204253.78712: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204253.78824: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204253.79094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.79097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.79100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.79118: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.79141: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.79207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204253.79253: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204253.79287: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.79348: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204253.79368: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204253.79561: variable 'network_connections' from source: play vars 22736 1727204253.79574: variable 'interface' from source: set_fact 22736 1727204253.79702: variable 'interface' from source: set_fact 22736 1727204253.79721: variable 'interface' from source: set_fact 22736 1727204253.79845: variable 'interface' from source: set_fact 22736 1727204253.79942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204253.79979: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204253.80024: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204253.80102: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204253.80136: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204253.80518: variable 'network_connections' from source: play vars 22736 1727204253.80538: variable 'interface' from source: set_fact 22736 1727204253.80661: variable 'interface' from source: set_fact 22736 1727204253.80894: variable 'interface' from source: set_fact 22736 1727204253.80898: variable 'interface' from source: set_fact 22736 1727204253.80900: variable '__network_packages_default_wireless' from source: role '' defaults 22736 1727204253.80972: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204253.81393: variable 'network_connections' from source: play vars 22736 1727204253.81405: variable 'interface' from source: set_fact 22736 1727204253.81487: variable 'interface' from source: set_fact 22736 1727204253.81503: variable 'interface' from source: set_fact 22736 1727204253.81585: variable 'interface' from source: set_fact 22736 1727204253.81624: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204253.81734: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204253.82143: variable 'network_connections' from source: play vars 22736 1727204253.82157: variable 'interface' from source: set_fact 22736 1727204253.82246: variable 'interface' from source: set_fact 22736 1727204253.82259: variable 'interface' from source: set_fact 22736 1727204253.82345: variable 'interface' from source: set_fact 22736 1727204253.82429: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204253.82518: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204253.82531: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204253.82610: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204253.82922: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204253.83534: variable 'network_connections' from source: play vars 22736 1727204253.83545: variable 'interface' from source: set_fact 22736 
1727204253.83619: variable 'interface' from source: set_fact 22736 1727204253.83635: variable 'interface' from source: set_fact 22736 1727204253.83718: variable 'interface' from source: set_fact 22736 1727204253.83737: variable 'ansible_distribution' from source: facts 22736 1727204253.83747: variable '__network_rh_distros' from source: role '' defaults 22736 1727204253.83758: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.83794: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204253.84097: variable 'ansible_distribution' from source: facts 22736 1727204253.84100: variable '__network_rh_distros' from source: role '' defaults 22736 1727204253.84103: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.84105: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204253.84327: variable 'ansible_distribution' from source: facts 22736 1727204253.84338: variable '__network_rh_distros' from source: role '' defaults 22736 1727204253.84350: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.84422: variable 'network_provider' from source: set_fact 22736 1727204253.84448: variable 'ansible_facts' from source: unknown 22736 1727204253.85620: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22736 1727204253.85629: when evaluation is False, skipping this task 22736 1727204253.85638: _execute() done 22736 1727204253.85646: dumping result to json 22736 1727204253.85654: done dumping result, returning 22736 1727204253.85667: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-4f4a-548a-000000000021] 22736 1727204253.85677: sending task result for task 12b410aa-8751-4f4a-548a-000000000021 skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22736 1727204253.85867: no more pending results, returning what we have 22736 1727204253.85871: results queue empty 22736 1727204253.85872: checking for any_errors_fatal 22736 1727204253.85884: done checking for any_errors_fatal 22736 1727204253.85885: checking for max_fail_percentage 22736 1727204253.85887: done checking for max_fail_percentage 22736 1727204253.85888: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.85890: done checking to see if all hosts have failed 22736 1727204253.85891: getting the remaining hosts for this loop 22736 1727204253.85893: done getting the remaining hosts for this loop 22736 1727204253.85897: getting the next task for host managed-node2 22736 1727204253.85904: done getting next task for host managed-node2 22736 1727204253.85908: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204253.85994: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.86015: getting variables 22736 1727204253.86017: in VariableManager get_vars() 22736 1727204253.86066: Calling all_inventory to load vars for managed-node2 22736 1727204253.86069: Calling groups_inventory to load vars for managed-node2 22736 1727204253.86071: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.86083: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.86093: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.86097: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.86620: done sending task result for task 12b410aa-8751-4f4a-548a-000000000021 22736 1727204253.86624: WORKER PROCESS EXITING 22736 1727204253.88748: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204253.91755: done with get_vars() 22736 1727204253.91799: done getting variables 22736 1727204253.91873: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:33 -0400 (0:00:00.194) 0:00:18.704 ***** 22736 1727204253.91916: entering _queue_task() for managed-node2/package 22736 1727204253.92276: worker is 1 (out of 1 available) 22736 1727204253.92396: exiting _queue_task() for managed-node2/package 22736 1727204253.92409: done queuing things up, now waiting for results queue to drain 22736 1727204253.92410: waiting for pending results... 
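Note: the task at roles/network/tasks/main.yml:85 uses the 'package' action and, as the next records show, is skipped because network_state != {} is False. A minimal sketch, with the package names inferred from the task name rather than from the trace:

    - name: Install NetworkManager and nmstate when using network_state variable
      package:
        name:
          - NetworkManager   # package names inferred from the task name; treat as assumptions
          - nmstate
        state: present
      when: network_state != {}   # recorded below as the false condition for this run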
22736 1727204253.92619: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204253.92754: in run() - task 12b410aa-8751-4f4a-548a-000000000022 22736 1727204253.92777: variable 'ansible_search_path' from source: unknown 22736 1727204253.92787: variable 'ansible_search_path' from source: unknown 22736 1727204253.92839: calling self._execute() 22736 1727204253.92959: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204253.92976: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204253.92993: variable 'omit' from source: magic vars 22736 1727204253.93456: variable 'ansible_distribution_major_version' from source: facts 22736 1727204253.93477: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204253.93681: variable 'network_state' from source: role '' defaults 22736 1727204253.93702: Evaluated conditional (network_state != {}): False 22736 1727204253.93715: when evaluation is False, skipping this task 22736 1727204253.93782: _execute() done 22736 1727204253.93785: dumping result to json 22736 1727204253.93790: done dumping result, returning 22736 1727204253.93794: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000022] 22736 1727204253.93797: sending task result for task 12b410aa-8751-4f4a-548a-000000000022 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204253.93951: no more pending results, returning what we have 22736 1727204253.93955: results queue empty 22736 1727204253.93957: checking for any_errors_fatal 22736 1727204253.93966: done checking for any_errors_fatal 22736 1727204253.93967: checking for max_fail_percentage 22736 1727204253.93969: done checking for max_fail_percentage 22736 1727204253.93971: checking to see if all hosts have failed and the running result is not ok 22736 1727204253.93972: done checking to see if all hosts have failed 22736 1727204253.93973: getting the remaining hosts for this loop 22736 1727204253.93975: done getting the remaining hosts for this loop 22736 1727204253.93980: getting the next task for host managed-node2 22736 1727204253.93988: done getting next task for host managed-node2 22736 1727204253.93995: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204253.93998: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204253.94019: getting variables 22736 1727204253.94022: in VariableManager get_vars() 22736 1727204253.94068: Calling all_inventory to load vars for managed-node2 22736 1727204253.94071: Calling groups_inventory to load vars for managed-node2 22736 1727204253.94075: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204253.94395: Calling all_plugins_play to load vars for managed-node2 22736 1727204253.94400: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204253.94407: done sending task result for task 12b410aa-8751-4f4a-548a-000000000022 22736 1727204253.94410: WORKER PROCESS EXITING 22736 1727204253.94419: Calling groups_plugins_play to load vars for managed-node2 22736 1727204253.96945: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204254.00155: done with get_vars() 22736 1727204254.00192: done getting variables 22736 1727204254.00264: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.083) 0:00:18.787 ***** 22736 1727204254.00304: entering _queue_task() for managed-node2/package 22736 1727204254.00663: worker is 1 (out of 1 available) 22736 1727204254.00677: exiting _queue_task() for managed-node2/package 22736 1727204254.00795: done queuing things up, now waiting for results queue to drain 22736 1727204254.00797: waiting for pending results... 
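The "Install NetworkManager and nmstate when using network_state variable" task above skipped because network_state != {} evaluated False (the distribution check, ansible_distribution_major_version != '6', was True). A sketch consistent with those logged conditions; the package names are inferred from the task name and are assumptions, not taken from the role source:

- name: Install NetworkManager and nmstate when using network_state variable
  package:
    name:
      - NetworkManager   # assumed from the task name
      - nmstate          # assumed from the task name
    state: present
  when:
    - ansible_distribution_major_version != '6'
    - network_state != {}   # False in this run, so the task is skipped

The "Install python3-libnmstate" task queued next is gated the same way, as the following lines show.
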
22736 1727204254.01005: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204254.01129: in run() - task 12b410aa-8751-4f4a-548a-000000000023 22736 1727204254.01154: variable 'ansible_search_path' from source: unknown 22736 1727204254.01163: variable 'ansible_search_path' from source: unknown 22736 1727204254.01208: calling self._execute() 22736 1727204254.01328: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204254.01343: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204254.01364: variable 'omit' from source: magic vars 22736 1727204254.01821: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.01839: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204254.02004: variable 'network_state' from source: role '' defaults 22736 1727204254.02026: Evaluated conditional (network_state != {}): False 22736 1727204254.02034: when evaluation is False, skipping this task 22736 1727204254.02042: _execute() done 22736 1727204254.02050: dumping result to json 22736 1727204254.02058: done dumping result, returning 22736 1727204254.02070: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000023] 22736 1727204254.02080: sending task result for task 12b410aa-8751-4f4a-548a-000000000023 skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204254.02347: no more pending results, returning what we have 22736 1727204254.02352: results queue empty 22736 1727204254.02353: checking for any_errors_fatal 22736 1727204254.02363: done checking for any_errors_fatal 22736 1727204254.02364: checking for max_fail_percentage 22736 1727204254.02366: done checking for max_fail_percentage 22736 1727204254.02367: checking to see if all hosts have failed and the running result is not ok 22736 1727204254.02369: done checking to see if all hosts have failed 22736 1727204254.02370: getting the remaining hosts for this loop 22736 1727204254.02371: done getting the remaining hosts for this loop 22736 1727204254.02376: getting the next task for host managed-node2 22736 1727204254.02384: done getting next task for host managed-node2 22736 1727204254.02388: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204254.02393: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204254.02411: getting variables 22736 1727204254.02415: in VariableManager get_vars() 22736 1727204254.02458: Calling all_inventory to load vars for managed-node2 22736 1727204254.02462: Calling groups_inventory to load vars for managed-node2 22736 1727204254.02465: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204254.02480: Calling all_plugins_play to load vars for managed-node2 22736 1727204254.02484: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204254.02488: Calling groups_plugins_play to load vars for managed-node2 22736 1727204254.03406: done sending task result for task 12b410aa-8751-4f4a-548a-000000000023 22736 1727204254.03409: WORKER PROCESS EXITING 22736 1727204254.04915: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204254.08461: done with get_vars() 22736 1727204254.08511: done getting variables 22736 1727204254.08633: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.083) 0:00:18.871 ***** 22736 1727204254.08669: entering _queue_task() for managed-node2/service 22736 1727204254.08671: Creating lock for service 22736 1727204254.09051: worker is 1 (out of 1 available) 22736 1727204254.09066: exiting _queue_task() for managed-node2/service 22736 1727204254.09081: done queuing things up, now waiting for results queue to drain 22736 1727204254.09083: waiting for pending results... 
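Both nmstate-related install tasks above are gated on network_state, which comes from the role defaults and is empty in this run, so network_state != {} is False and both tasks skip. They would only run if the play supplied a non-empty nmstate-style state, for example (hypothetical values, not from this run):

# Hypothetical play-level variable; with the role default of {} the
# "network_state != {}" conditions above stay False and the tasks skip.
network_state:
  interfaces:
    - name: eth1        # example interface name
      type: ethernet
      state: up
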
22736 1727204254.09636: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204254.09975: in run() - task 12b410aa-8751-4f4a-548a-000000000024 22736 1727204254.09995: variable 'ansible_search_path' from source: unknown 22736 1727204254.09999: variable 'ansible_search_path' from source: unknown 22736 1727204254.10038: calling self._execute() 22736 1727204254.10263: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204254.10271: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204254.10284: variable 'omit' from source: magic vars 22736 1727204254.11297: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.11312: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204254.11635: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204254.12243: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204254.15321: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204254.15802: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204254.15848: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204254.15888: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204254.15925: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204254.16025: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.16075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.16294: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.16298: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.16301: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.16303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.16305: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.16308: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22736 1727204254.16337: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.16355: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.16408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.16438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.16472: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.16522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.16539: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.16773: variable 'network_connections' from source: play vars 22736 1727204254.16793: variable 'interface' from source: set_fact 22736 1727204254.16872: variable 'interface' from source: set_fact 22736 1727204254.16887: variable 'interface' from source: set_fact 22736 1727204254.16954: variable 'interface' from source: set_fact 22736 1727204254.17038: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204254.17269: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204254.17322: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204254.17362: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204254.17415: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204254.17471: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204254.17531: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204254.17535: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.17598: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204254.17639: variable '__network_team_connections_defined' from source: role '' defaults 
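The __network_wireless_connections_defined and __network_team_connections_defined flags being resolved here come from the role defaults and are evidently templated over network_connections, which is why the interface variable is resolved repeatedly around them. One plausible, purely illustrative way such flags could be expressed (not necessarily the role's actual defaults):

# Illustrative only: true when any declared connection is of type wireless.
__network_wireless_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'eq', 'wireless') | list | length > 0 }}"
# Analogous flag for team interfaces.
__network_team_connections_defined: "{{ network_connections | selectattr('type', 'defined') | selectattr('type', 'eq', 'team') | list | length > 0 }}"

In this run neither flag is true, so the restart task is skipped, as the next lines show.
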
22736 1727204254.17977: variable 'network_connections' from source: play vars 22736 1727204254.17983: variable 'interface' from source: set_fact 22736 1727204254.18061: variable 'interface' from source: set_fact 22736 1727204254.18065: variable 'interface' from source: set_fact 22736 1727204254.18144: variable 'interface' from source: set_fact 22736 1727204254.18251: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204254.18255: when evaluation is False, skipping this task 22736 1727204254.18257: _execute() done 22736 1727204254.18260: dumping result to json 22736 1727204254.18262: done dumping result, returning 22736 1727204254.18264: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000024] 22736 1727204254.18278: sending task result for task 12b410aa-8751-4f4a-548a-000000000024 22736 1727204254.18349: done sending task result for task 12b410aa-8751-4f4a-548a-000000000024 22736 1727204254.18463: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204254.18515: no more pending results, returning what we have 22736 1727204254.18519: results queue empty 22736 1727204254.18520: checking for any_errors_fatal 22736 1727204254.18527: done checking for any_errors_fatal 22736 1727204254.18528: checking for max_fail_percentage 22736 1727204254.18530: done checking for max_fail_percentage 22736 1727204254.18531: checking to see if all hosts have failed and the running result is not ok 22736 1727204254.18532: done checking to see if all hosts have failed 22736 1727204254.18533: getting the remaining hosts for this loop 22736 1727204254.18534: done getting the remaining hosts for this loop 22736 1727204254.18538: getting the next task for host managed-node2 22736 1727204254.18544: done getting next task for host managed-node2 22736 1727204254.18549: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204254.18551: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204254.18564: getting variables 22736 1727204254.18566: in VariableManager get_vars() 22736 1727204254.18605: Calling all_inventory to load vars for managed-node2 22736 1727204254.18609: Calling groups_inventory to load vars for managed-node2 22736 1727204254.18611: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204254.18625: Calling all_plugins_play to load vars for managed-node2 22736 1727204254.18629: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204254.18633: Calling groups_plugins_play to load vars for managed-node2 22736 1727204254.21526: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204254.24588: done with get_vars() 22736 1727204254.24632: done getting variables 22736 1727204254.24711: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:34 -0400 (0:00:00.160) 0:00:19.032 ***** 22736 1727204254.24745: entering _queue_task() for managed-node2/service 22736 1727204254.25229: worker is 1 (out of 1 available) 22736 1727204254.25245: exiting _queue_task() for managed-node2/service 22736 1727204254.25258: done queuing things up, now waiting for results queue to drain 22736 1727204254.25259: waiting for pending results... 22736 1727204254.25468: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204254.25601: in run() - task 12b410aa-8751-4f4a-548a-000000000025 22736 1727204254.25624: variable 'ansible_search_path' from source: unknown 22736 1727204254.25633: variable 'ansible_search_path' from source: unknown 22736 1727204254.25682: calling self._execute() 22736 1727204254.25803: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204254.25819: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204254.25838: variable 'omit' from source: magic vars 22736 1727204254.26309: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.26333: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204254.26529: variable 'network_provider' from source: set_fact 22736 1727204254.26545: variable 'network_state' from source: role '' defaults 22736 1727204254.26595: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22736 1727204254.26599: variable 'omit' from source: magic vars 22736 1727204254.26634: variable 'omit' from source: magic vars 22736 1727204254.26681: variable 'network_service_name' from source: role '' defaults 22736 1727204254.26788: variable 'network_service_name' from source: role '' defaults 22736 1727204254.26959: variable '__network_provider_setup' from source: role '' defaults 22736 1727204254.26962: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204254.27043: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204254.27059: variable '__network_packages_default_nm' from source: role '' defaults 
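Unlike the previous tasks, "Enable and start NetworkManager" does not skip: network_provider == "nm" or network_state != {} evaluated True, and the service action is about to run against network_service_name from the role defaults, which resolves to NetworkManager in this run (see the module arguments further down). A sketch of a task consistent with what the log records, not the literal role task:

- name: Enable and start NetworkManager
  service:
    name: "{{ network_service_name }}"   # NetworkManager in this run
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
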
22736 1727204254.27177: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204254.27476: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204254.30193: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204254.30301: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204254.30395: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204254.30408: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204254.30445: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204254.30601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.30605: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.30634: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.30695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.30728: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.30795: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.30839: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.30879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.30945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.30967: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.31299: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204254.31471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.31528: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.31553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.31590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.31602: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.31688: variable 'ansible_python' from source: facts 22736 1727204254.31714: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204254.31787: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204254.31860: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204254.31971: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.31993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.32014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.32051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.32063: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.32107: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204254.32135: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204254.32155: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.32188: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204254.32202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204254.32322: variable 'network_connections' from 
source: play vars 22736 1727204254.32331: variable 'interface' from source: set_fact 22736 1727204254.32397: variable 'interface' from source: set_fact 22736 1727204254.32407: variable 'interface' from source: set_fact 22736 1727204254.32470: variable 'interface' from source: set_fact 22736 1727204254.32562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204254.32734: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204254.32777: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204254.32820: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204254.32858: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204254.32914: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204254.32942: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204254.32969: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204254.33003: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204254.33048: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204254.33300: variable 'network_connections' from source: play vars 22736 1727204254.33307: variable 'interface' from source: set_fact 22736 1727204254.33472: variable 'interface' from source: set_fact 22736 1727204254.33475: variable 'interface' from source: set_fact 22736 1727204254.33602: variable 'interface' from source: set_fact 22736 1727204254.33652: variable '__network_packages_default_wireless' from source: role '' defaults 22736 1727204254.33770: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204254.34222: variable 'network_connections' from source: play vars 22736 1727204254.34268: variable 'interface' from source: set_fact 22736 1727204254.34381: variable 'interface' from source: set_fact 22736 1727204254.34384: variable 'interface' from source: set_fact 22736 1727204254.34471: variable 'interface' from source: set_fact 22736 1727204254.34500: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204254.34571: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204254.34828: variable 'network_connections' from source: play vars 22736 1727204254.34831: variable 'interface' from source: set_fact 22736 1727204254.34893: variable 'interface' from source: set_fact 22736 1727204254.34900: variable 'interface' from source: set_fact 22736 1727204254.34960: variable 'interface' from source: set_fact 22736 1727204254.35015: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204254.35065: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 
1727204254.35072: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204254.35124: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204254.35310: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204254.35725: variable 'network_connections' from source: play vars 22736 1727204254.35728: variable 'interface' from source: set_fact 22736 1727204254.35780: variable 'interface' from source: set_fact 22736 1727204254.35787: variable 'interface' from source: set_fact 22736 1727204254.35847: variable 'interface' from source: set_fact 22736 1727204254.35851: variable 'ansible_distribution' from source: facts 22736 1727204254.35857: variable '__network_rh_distros' from source: role '' defaults 22736 1727204254.35863: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.35883: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204254.36031: variable 'ansible_distribution' from source: facts 22736 1727204254.36036: variable '__network_rh_distros' from source: role '' defaults 22736 1727204254.36042: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.36049: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204254.36198: variable 'ansible_distribution' from source: facts 22736 1727204254.36202: variable '__network_rh_distros' from source: role '' defaults 22736 1727204254.36208: variable 'ansible_distribution_major_version' from source: facts 22736 1727204254.36238: variable 'network_provider' from source: set_fact 22736 1727204254.36260: variable 'omit' from source: magic vars 22736 1727204254.36286: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204254.36315: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204254.36330: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204254.36346: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204254.36358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204254.36391: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204254.36396: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204254.36399: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204254.36503: Set connection var ansible_timeout to 10 22736 1727204254.36506: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204254.36509: Set connection var ansible_shell_executable to /bin/sh 22736 1727204254.36514: Set connection var ansible_shell_type to sh 22736 1727204254.36531: Set connection var ansible_pipelining to False 22736 1727204254.36534: Set connection var ansible_connection to ssh 22736 1727204254.36548: variable 'ansible_shell_executable' from source: unknown 22736 1727204254.36551: variable 'ansible_connection' from source: unknown 22736 1727204254.36554: variable 'ansible_module_compression' from source: unknown 22736 1727204254.36559: variable 'ansible_shell_type' from source: unknown 22736 1727204254.36561: variable 
'ansible_shell_executable' from source: unknown 22736 1727204254.36566: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204254.36589: variable 'ansible_pipelining' from source: unknown 22736 1727204254.36593: variable 'ansible_timeout' from source: unknown 22736 1727204254.36596: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204254.36902: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204254.36906: variable 'omit' from source: magic vars 22736 1727204254.36908: starting attempt loop 22736 1727204254.36911: running the handler 22736 1727204254.36913: variable 'ansible_facts' from source: unknown 22736 1727204254.38015: _low_level_execute_command(): starting 22736 1727204254.38025: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204254.38611: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204254.38629: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204254.38644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204254.38691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204254.38706: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204254.38757: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204254.40651: stdout chunk (state=3): >>>/root <<< 22736 1727204254.40680: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204254.40766: stderr chunk (state=3): >>><<< 22736 1727204254.40770: stdout chunk (state=3): >>><<< 22736 1727204254.40879: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204254.40882: _low_level_execute_command(): starting 22736 1727204254.40886: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870 `" && echo ansible-tmp-1727204254.4080381-23582-153256054895870="` echo /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870 `" ) && sleep 0' 22736 1727204254.41673: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204254.41758: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204254.41801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204254.43939: stdout chunk (state=3): >>>ansible-tmp-1727204254.4080381-23582-153256054895870=/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870 <<< 22736 1727204254.44110: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204254.44159: stderr chunk (state=3): >>><<< 22736 1727204254.44177: stdout chunk (state=3): >>><<< 22736 1727204254.44232: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204254.4080381-23582-153256054895870=/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204254.44254: variable 'ansible_module_compression' from source: unknown 22736 1727204254.44325: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 22736 1727204254.44349: ANSIBALLZ: Acquiring lock 22736 1727204254.44406: ANSIBALLZ: Lock acquired: 140553536881728 22736 1727204254.44409: ANSIBALLZ: Creating module 22736 1727204254.71661: ANSIBALLZ: Writing module into payload 22736 1727204254.71892: ANSIBALLZ: Writing module 22736 1727204254.71938: ANSIBALLZ: Renaming module 22736 1727204254.71995: ANSIBALLZ: Done creating module 22736 1727204254.72001: variable 'ansible_facts' from source: unknown 22736 1727204254.72241: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py 22736 1727204254.72627: Sending initial data 22736 1727204254.72635: Sent initial data (156 bytes) 22736 1727204254.73120: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204254.73124: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204254.73127: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204254.73130: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204254.73133: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204254.73188: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204254.73203: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204254.73230: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204254.73313: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204254.75100: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports 
extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204254.75156: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204254.75219: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprk8wesdz /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py <<< 22736 1727204254.75243: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py" <<< 22736 1727204254.75257: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprk8wesdz" to remote "/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py" <<< 22736 1727204254.77714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204254.77809: stderr chunk (state=3): >>><<< 22736 1727204254.77822: stdout chunk (state=3): >>><<< 22736 1727204254.77865: done transferring module to remote 22736 1727204254.77884: _low_level_execute_command(): starting 22736 1727204254.77898: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/ /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py && sleep 0' 22736 1727204254.78702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204254.78803: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204254.78846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204254.78877: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204254.78984: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204254.89727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204254.89896: stderr chunk (state=3): >>><<< 22736 1727204254.89900: stdout chunk (state=3): >>><<< 22736 1727204254.89904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204254.89907: _low_level_execute_command(): starting 22736 1727204254.89909: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/AnsiballZ_systemd.py && sleep 0' 22736 1727204254.90668: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204254.90707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204254.90801: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204255.24519: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", 
"ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4431872", "MemoryAvailable": "infinity", "CPUUsageNSec": "1283186000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", 
"CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", 
"FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22736 1727204255.26625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204255.26691: stderr chunk (state=3): >>><<< 22736 1727204255.26695: stdout chunk (state=3): >>><<< 22736 1727204255.26718: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4431872", "MemoryAvailable": "infinity", "CPUUsageNSec": "1283186000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", 
"StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204255.26891: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204255.26908: _low_level_execute_command(): starting 22736 1727204255.26918: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204254.4080381-23582-153256054895870/ > /dev/null 2>&1 && sleep 0' 22736 1727204255.27798: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204255.27802: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204255.27807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204255.27810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204255.27812: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204255.27817: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204255.27819: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204255.27821: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204255.27824: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204255.27827: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22736 1727204255.27829: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204255.27830: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204255.27832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204255.27834: stderr chunk 
(state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204255.27836: stderr chunk (state=3): >>>debug2: match found <<< 22736 1727204255.27838: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204255.27839: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204255.27841: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204255.28021: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204255.29965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204255.29969: stdout chunk (state=3): >>><<< 22736 1727204255.29978: stderr chunk (state=3): >>><<< 22736 1727204255.30004: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204255.30015: handler run complete 22736 1727204255.30108: attempt loop complete, returning result 22736 1727204255.30112: _execute() done 22736 1727204255.30117: dumping result to json 22736 1727204255.30141: done dumping result, returning 22736 1727204255.30153: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-4f4a-548a-000000000025] 22736 1727204255.30158: sending task result for task 12b410aa-8751-4f4a-548a-000000000025 22736 1727204255.30741: done sending task result for task 12b410aa-8751-4f4a-548a-000000000025 22736 1727204255.30745: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204255.30836: no more pending results, returning what we have 22736 1727204255.30840: results queue empty 22736 1727204255.30841: checking for any_errors_fatal 22736 1727204255.30848: done checking for any_errors_fatal 22736 1727204255.30849: checking for max_fail_percentage 22736 1727204255.30850: done checking for max_fail_percentage 22736 1727204255.30851: checking to see if all hosts have failed and the running result is not ok 22736 1727204255.30853: done checking to see if all hosts have failed 22736 1727204255.30854: getting the remaining hosts for this loop 22736 1727204255.30856: done getting the remaining hosts for this loop 22736 
1727204255.30860: getting the next task for host managed-node2 22736 1727204255.30866: done getting next task for host managed-node2 22736 1727204255.30871: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204255.30874: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204255.30886: getting variables 22736 1727204255.30888: in VariableManager get_vars() 22736 1727204255.31048: Calling all_inventory to load vars for managed-node2 22736 1727204255.31052: Calling groups_inventory to load vars for managed-node2 22736 1727204255.31055: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204255.31066: Calling all_plugins_play to load vars for managed-node2 22736 1727204255.31070: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204255.31074: Calling groups_plugins_play to load vars for managed-node2 22736 1727204255.33427: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204255.36382: done with get_vars() 22736 1727204255.36440: done getting variables 22736 1727204255.36521: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:35 -0400 (0:00:01.118) 0:00:20.150 ***** 22736 1727204255.36570: entering _queue_task() for managed-node2/service 22736 1727204255.36941: worker is 1 (out of 1 available) 22736 1727204255.36957: exiting _queue_task() for managed-node2/service 22736 1727204255.36970: done queuing things up, now waiting for results queue to drain 22736 1727204255.36972: waiting for pending results... 
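[Editor's note] The preceding result records the module invocation that the "Enable and start NetworkManager" task dispatched (ansible.legacy.systemd with name=NetworkManager, state=started, enabled=true; output censored by no_log). For readers who want to replay that single step outside the role, the following is a minimal standalone sketch based only on those logged module_args; the play header (hosts, become) is a hypothetical frame and is not taken from the role.

# Editorial sketch -- not part of the captured log or the role's source.
# Reproduces the invocation recorded above under assumed play settings.
- hosts: managed-node2          # assumption: any reachable inventory host
  become: true                  # assumption: root is needed to manage units
  tasks:
    - name: Enable and start NetworkManager (replay of the logged step)
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true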
22736 1727204255.37288: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204255.37429: in run() - task 12b410aa-8751-4f4a-548a-000000000026 22736 1727204255.37453: variable 'ansible_search_path' from source: unknown 22736 1727204255.37462: variable 'ansible_search_path' from source: unknown 22736 1727204255.37509: calling self._execute() 22736 1727204255.37631: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.37650: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.37666: variable 'omit' from source: magic vars 22736 1727204255.38138: variable 'ansible_distribution_major_version' from source: facts 22736 1727204255.38182: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204255.38332: variable 'network_provider' from source: set_fact 22736 1727204255.38343: Evaluated conditional (network_provider == "nm"): True 22736 1727204255.38474: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204255.38618: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204255.38848: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204255.41697: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204255.41701: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204255.41704: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204255.41706: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204255.41730: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204255.41848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204255.41894: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204255.41939: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204255.41999: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204255.42022: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204255.42094: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204255.42129: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22736 1727204255.42172: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204255.42229: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204255.42250: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204255.42316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204255.42350: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204255.42478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204255.42481: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204255.42484: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204255.42665: variable 'network_connections' from source: play vars 22736 1727204255.42686: variable 'interface' from source: set_fact 22736 1727204255.42793: variable 'interface' from source: set_fact 22736 1727204255.42818: variable 'interface' from source: set_fact 22736 1727204255.42899: variable 'interface' from source: set_fact 22736 1727204255.43003: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204255.43242: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204255.43305: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204255.43358: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204255.43405: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204255.43469: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204255.43570: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204255.43573: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204255.43577: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204255.43644: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204255.44095: variable 'network_connections' from source: play vars 22736 1727204255.44098: variable 'interface' from source: set_fact 22736 1727204255.44128: variable 'interface' from source: set_fact 22736 1727204255.44140: variable 'interface' from source: set_fact 22736 1727204255.44225: variable 'interface' from source: set_fact 22736 1727204255.44283: Evaluated conditional (__network_wpa_supplicant_required): False 22736 1727204255.44294: when evaluation is False, skipping this task 22736 1727204255.44302: _execute() done 22736 1727204255.44318: dumping result to json 22736 1727204255.44331: done dumping result, returning 22736 1727204255.44349: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-4f4a-548a-000000000026] 22736 1727204255.44358: sending task result for task 12b410aa-8751-4f4a-548a-000000000026 22736 1727204255.44585: done sending task result for task 12b410aa-8751-4f4a-548a-000000000026 22736 1727204255.44590: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22736 1727204255.44646: no more pending results, returning what we have 22736 1727204255.44650: results queue empty 22736 1727204255.44651: checking for any_errors_fatal 22736 1727204255.44683: done checking for any_errors_fatal 22736 1727204255.44684: checking for max_fail_percentage 22736 1727204255.44686: done checking for max_fail_percentage 22736 1727204255.44687: checking to see if all hosts have failed and the running result is not ok 22736 1727204255.44688: done checking to see if all hosts have failed 22736 1727204255.44691: getting the remaining hosts for this loop 22736 1727204255.44693: done getting the remaining hosts for this loop 22736 1727204255.44698: getting the next task for host managed-node2 22736 1727204255.44896: done getting next task for host managed-node2 22736 1727204255.44902: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204255.44904: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204255.44920: getting variables 22736 1727204255.44922: in VariableManager get_vars() 22736 1727204255.44963: Calling all_inventory to load vars for managed-node2 22736 1727204255.44967: Calling groups_inventory to load vars for managed-node2 22736 1727204255.44969: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204255.44980: Calling all_plugins_play to load vars for managed-node2 22736 1727204255.44984: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204255.44988: Calling groups_plugins_play to load vars for managed-node2 22736 1727204255.47278: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204255.50352: done with get_vars() 22736 1727204255.50403: done getting variables 22736 1727204255.50479: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.139) 0:00:20.290 ***** 22736 1727204255.50522: entering _queue_task() for managed-node2/service 22736 1727204255.51123: worker is 1 (out of 1 available) 22736 1727204255.51134: exiting _queue_task() for managed-node2/service 22736 1727204255.51145: done queuing things up, now waiting for results queue to drain 22736 1727204255.51147: waiting for pending results... 22736 1727204255.51391: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204255.51396: in run() - task 12b410aa-8751-4f4a-548a-000000000027 22736 1727204255.51399: variable 'ansible_search_path' from source: unknown 22736 1727204255.51402: variable 'ansible_search_path' from source: unknown 22736 1727204255.51435: calling self._execute() 22736 1727204255.51555: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.51570: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.51591: variable 'omit' from source: magic vars 22736 1727204255.52065: variable 'ansible_distribution_major_version' from source: facts 22736 1727204255.52084: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204255.52254: variable 'network_provider' from source: set_fact 22736 1727204255.52268: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204255.52276: when evaluation is False, skipping this task 22736 1727204255.52285: _execute() done 22736 1727204255.52296: dumping result to json 22736 1727204255.52304: done dumping result, returning 22736 1727204255.52353: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-4f4a-548a-000000000027] 22736 1727204255.52358: sending task result for task 12b410aa-8751-4f4a-548a-000000000027 skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204255.52521: no more pending results, returning what we have 22736 1727204255.52525: results queue empty 22736 1727204255.52526: checking for 
any_errors_fatal 22736 1727204255.52540: done checking for any_errors_fatal 22736 1727204255.52541: checking for max_fail_percentage 22736 1727204255.52543: done checking for max_fail_percentage 22736 1727204255.52544: checking to see if all hosts have failed and the running result is not ok 22736 1727204255.52545: done checking to see if all hosts have failed 22736 1727204255.52546: getting the remaining hosts for this loop 22736 1727204255.52548: done getting the remaining hosts for this loop 22736 1727204255.52553: getting the next task for host managed-node2 22736 1727204255.52561: done getting next task for host managed-node2 22736 1727204255.52595: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204255.52600: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204255.52618: getting variables 22736 1727204255.52620: in VariableManager get_vars() 22736 1727204255.52664: Calling all_inventory to load vars for managed-node2 22736 1727204255.52668: Calling groups_inventory to load vars for managed-node2 22736 1727204255.52671: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204255.52795: done sending task result for task 12b410aa-8751-4f4a-548a-000000000027 22736 1727204255.52799: WORKER PROCESS EXITING 22736 1727204255.52815: Calling all_plugins_play to load vars for managed-node2 22736 1727204255.52820: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204255.52825: Calling groups_plugins_play to load vars for managed-node2 22736 1727204255.55421: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204255.58568: done with get_vars() 22736 1727204255.58619: done getting variables 22736 1727204255.58698: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.082) 0:00:20.372 ***** 22736 1727204255.58735: entering _queue_task() for managed-node2/copy 22736 1727204255.59318: worker is 1 (out of 1 available) 22736 1727204255.59330: exiting _queue_task() for managed-node2/copy 22736 1727204255.59344: done queuing things up, now waiting for results queue to drain 22736 1727204255.59345: waiting for pending results... 
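[Editor's note] The tasks around this point are provider-gated: "Enable and start wpa_supplicant" is skipped because __network_wpa_supplicant_required evaluates False, and "Enable network service" / "Ensure initscripts network file dependency is present" are skipped because network_provider is "nm" rather than "initscripts". The sketch below only illustrates how such when: gating is typically written; the variable names follow the log, but the task bodies are assumptions and this is not the role's actual tasks/main.yml.

# Editorial sketch -- illustrative gating only, not the role's source.
- name: Enable and start wpa_supplicant
  ansible.builtin.service:
    name: wpa_supplicant
    state: started
    enabled: true
  when: __network_wpa_supplicant_required     # False in this run, so skipped

- name: Enable network service
  ansible.builtin.service:
    name: network
    state: started
    enabled: true
  when: network_provider == "initscripts"     # provider is "nm" here, so skipped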
22736 1727204255.59471: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204255.59601: in run() - task 12b410aa-8751-4f4a-548a-000000000028 22736 1727204255.59626: variable 'ansible_search_path' from source: unknown 22736 1727204255.59634: variable 'ansible_search_path' from source: unknown 22736 1727204255.59685: calling self._execute() 22736 1727204255.59805: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.59820: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.59897: variable 'omit' from source: magic vars 22736 1727204255.60296: variable 'ansible_distribution_major_version' from source: facts 22736 1727204255.60318: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204255.60476: variable 'network_provider' from source: set_fact 22736 1727204255.60488: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204255.60498: when evaluation is False, skipping this task 22736 1727204255.60506: _execute() done 22736 1727204255.60513: dumping result to json 22736 1727204255.60520: done dumping result, returning 22736 1727204255.60532: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-4f4a-548a-000000000028] 22736 1727204255.60540: sending task result for task 12b410aa-8751-4f4a-548a-000000000028 skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22736 1727204255.60841: no more pending results, returning what we have 22736 1727204255.60846: results queue empty 22736 1727204255.60847: checking for any_errors_fatal 22736 1727204255.60856: done checking for any_errors_fatal 22736 1727204255.60857: checking for max_fail_percentage 22736 1727204255.60859: done checking for max_fail_percentage 22736 1727204255.60861: checking to see if all hosts have failed and the running result is not ok 22736 1727204255.60862: done checking to see if all hosts have failed 22736 1727204255.60863: getting the remaining hosts for this loop 22736 1727204255.60865: done getting the remaining hosts for this loop 22736 1727204255.60870: getting the next task for host managed-node2 22736 1727204255.60880: done getting next task for host managed-node2 22736 1727204255.60885: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204255.60888: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204255.60909: getting variables 22736 1727204255.60912: in VariableManager get_vars() 22736 1727204255.60961: Calling all_inventory to load vars for managed-node2 22736 1727204255.60965: Calling groups_inventory to load vars for managed-node2 22736 1727204255.60968: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204255.60986: Calling all_plugins_play to load vars for managed-node2 22736 1727204255.61209: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204255.61217: Calling groups_plugins_play to load vars for managed-node2 22736 1727204255.61828: done sending task result for task 12b410aa-8751-4f4a-548a-000000000028 22736 1727204255.61831: WORKER PROCESS EXITING 22736 1727204255.63608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204255.66475: done with get_vars() 22736 1727204255.66531: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:35 -0400 (0:00:00.079) 0:00:20.451 ***** 22736 1727204255.66659: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204255.66662: Creating lock for fedora.linux_system_roles.network_connections 22736 1727204255.67062: worker is 1 (out of 1 available) 22736 1727204255.67076: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204255.67206: done queuing things up, now waiting for results queue to drain 22736 1727204255.67209: waiting for pending results... 22736 1727204255.67413: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204255.67545: in run() - task 12b410aa-8751-4f4a-548a-000000000029 22736 1727204255.67574: variable 'ansible_search_path' from source: unknown 22736 1727204255.67582: variable 'ansible_search_path' from source: unknown 22736 1727204255.67633: calling self._execute() 22736 1727204255.67750: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.67868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.67873: variable 'omit' from source: magic vars 22736 1727204255.68254: variable 'ansible_distribution_major_version' from source: facts 22736 1727204255.68273: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204255.68285: variable 'omit' from source: magic vars 22736 1727204255.68345: variable 'omit' from source: magic vars 22736 1727204255.68555: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204255.71162: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204255.71263: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204255.71319: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204255.71375: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204255.71416: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204255.71530: 
variable 'network_provider' from source: set_fact 22736 1727204255.71726: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204255.71797: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204255.71830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204255.71907: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204255.71929: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204255.72095: variable 'omit' from source: magic vars 22736 1727204255.72205: variable 'omit' from source: magic vars 22736 1727204255.72361: variable 'network_connections' from source: play vars 22736 1727204255.72383: variable 'interface' from source: set_fact 22736 1727204255.72484: variable 'interface' from source: set_fact 22736 1727204255.72502: variable 'interface' from source: set_fact 22736 1727204255.72593: variable 'interface' from source: set_fact 22736 1727204255.72813: variable 'omit' from source: magic vars 22736 1727204255.72831: variable '__lsr_ansible_managed' from source: task vars 22736 1727204255.72920: variable '__lsr_ansible_managed' from source: task vars 22736 1727204255.73325: Loaded config def from plugin (lookup/template) 22736 1727204255.73328: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22736 1727204255.73496: File lookup term: get_ansible_managed.j2 22736 1727204255.73500: variable 'ansible_search_path' from source: unknown 22736 1727204255.73503: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22736 1727204255.73507: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22736 1727204255.73510: variable 'ansible_search_path' from source: 
unknown 22736 1727204255.81564: variable 'ansible_managed' from source: unknown 22736 1727204255.81716: variable 'omit' from source: magic vars 22736 1727204255.81745: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204255.81769: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204255.81785: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204255.81803: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204255.81813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204255.81841: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204255.81846: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.81849: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.81929: Set connection var ansible_timeout to 10 22736 1727204255.81940: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204255.81950: Set connection var ansible_shell_executable to /bin/sh 22736 1727204255.81952: Set connection var ansible_shell_type to sh 22736 1727204255.81965: Set connection var ansible_pipelining to False 22736 1727204255.81968: Set connection var ansible_connection to ssh 22736 1727204255.81984: variable 'ansible_shell_executable' from source: unknown 22736 1727204255.81987: variable 'ansible_connection' from source: unknown 22736 1727204255.81992: variable 'ansible_module_compression' from source: unknown 22736 1727204255.81997: variable 'ansible_shell_type' from source: unknown 22736 1727204255.82000: variable 'ansible_shell_executable' from source: unknown 22736 1727204255.82005: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204255.82010: variable 'ansible_pipelining' from source: unknown 22736 1727204255.82014: variable 'ansible_timeout' from source: unknown 22736 1727204255.82021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204255.82135: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204255.82146: variable 'omit' from source: magic vars 22736 1727204255.82153: starting attempt loop 22736 1727204255.82156: running the handler 22736 1727204255.82170: _low_level_execute_command(): starting 22736 1727204255.82183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204255.82684: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204255.82723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204255.82726: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204255.82729: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204255.82731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204255.82787: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204255.82796: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204255.82842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204255.84620: stdout chunk (state=3): >>>/root <<< 22736 1727204255.84727: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204255.84782: stderr chunk (state=3): >>><<< 22736 1727204255.84792: stdout chunk (state=3): >>><<< 22736 1727204255.84812: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204255.84827: _low_level_execute_command(): starting 22736 1727204255.84834: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347 `" && echo ansible-tmp-1727204255.8481467-23618-141882946768347="` echo /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347 `" ) && sleep 0' 22736 1727204255.85281: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204255.85325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204255.85328: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204255.85331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204255.85333: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204255.85335: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204255.85385: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204255.85391: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204255.85433: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204255.87464: stdout chunk (state=3): >>>ansible-tmp-1727204255.8481467-23618-141882946768347=/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347 <<< 22736 1727204255.87580: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204255.87794: stderr chunk (state=3): >>><<< 22736 1727204255.87798: stdout chunk (state=3): >>><<< 22736 1727204255.87801: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204255.8481467-23618-141882946768347=/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204255.87804: variable 'ansible_module_compression' from source: unknown 22736 1727204255.87806: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 22736 1727204255.87808: ANSIBALLZ: Acquiring lock 22736 1727204255.87810: ANSIBALLZ: Lock acquired: 140553535186816 22736 1727204255.87812: ANSIBALLZ: Creating module 22736 1727204256.06643: ANSIBALLZ: Writing module into payload 22736 1727204256.07129: ANSIBALLZ: Writing module 22736 1727204256.07158: ANSIBALLZ: Renaming module 22736 1727204256.07162: ANSIBALLZ: Done creating module 22736 1727204256.07303: variable 'ansible_facts' from source: unknown 22736 1727204256.07318: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py 22736 1727204256.07574: Sending initial data 22736 1727204256.07577: Sent initial data (168 bytes) 22736 1727204256.08316: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 
1727204256.08325: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204256.08404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204256.08426: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204256.10236: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204256.10318: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204256.10350: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp78qbzk_8 /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py <<< 22736 1727204256.10354: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py" <<< 22736 1727204256.10380: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 22736 1727204256.10399: stderr chunk (state=3): >>>debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp78qbzk_8" to remote "/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py" <<< 22736 1727204256.11648: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204256.11721: stderr chunk (state=3): >>><<< 22736 1727204256.11725: stdout chunk (state=3): >>><<< 22736 1727204256.11750: done transferring module to remote 22736 1727204256.11761: _low_level_execute_command(): starting 22736 1727204256.11767: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/ /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py && sleep 0' 22736 1727204256.12261: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204256.12264: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204256.12267: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204256.12270: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204256.12272: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204256.12333: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204256.12339: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204256.12378: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204256.14341: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204256.14398: stderr chunk (state=3): >>><<< 22736 1727204256.14401: stdout chunk (state=3): >>><<< 22736 1727204256.14418: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204256.14422: _low_level_execute_command(): starting 22736 1727204256.14431: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/AnsiballZ_network_connections.py && sleep 0' 22736 1727204256.14886: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204256.14924: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204256.14927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204256.14930: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204256.14932: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204256.14935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204256.14996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204256.15001: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204256.15048: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204256.50838: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", 
"connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}}<<< 22736 1727204256.50886: stdout chunk (state=3): >>> <<< 22736 1727204256.53113: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204256.53155: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 22736 1727204256.53158: stdout chunk (state=3): >>><<< 22736 1727204256.53161: stderr chunk (state=3): >>><<< 22736 1727204256.53219: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "interface_name": "lsr27", "state": "up", "type": "ethernet", "autoconnect": true, "ip": {"address": "192.0.2.1/24"}}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204256.53265: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'interface_name': 'lsr27', 'state': 'up', 'type': 'ethernet', 'autoconnect': True, 'ip': {'address': '192.0.2.1/24'}}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204256.53288: _low_level_execute_command(): starting 22736 1727204256.53302: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204255.8481467-23618-141882946768347/ > /dev/null 2>&1 && sleep 0' 22736 1727204256.54008: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204256.54025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204256.54041: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204256.54069: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204256.54274: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204256.54414: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204256.54456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204256.56530: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204256.56688: stderr chunk (state=3): >>><<< 22736 1727204256.56704: stdout chunk (state=3): >>><<< 22736 1727204256.57103: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204256.57112: handler run complete 22736 1727204256.57115: attempt loop complete, returning result 22736 1727204256.57117: _execute() done 22736 1727204256.57119: dumping result to json 22736 1727204256.57121: done dumping result, returning 22736 1727204256.57124: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-4f4a-548a-000000000029] 22736 1727204256.57126: sending task result for task 12b410aa-8751-4f4a-548a-000000000029 changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: [003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 [004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active) 22736 1727204256.57363: no more pending results, returning what we have 22736 1727204256.57367: results queue empty 22736 1727204256.57368: checking for any_errors_fatal 22736 1727204256.57379: done checking for any_errors_fatal 22736 1727204256.57380: checking for max_fail_percentage 22736 1727204256.57382: done checking for max_fail_percentage 22736 1727204256.57383: checking to see if all hosts have failed and the running result is not ok 22736 1727204256.57384: done checking to see if all hosts have failed 22736 1727204256.57385: getting the remaining hosts for this loop 22736 1727204256.57387: done getting the remaining hosts for this loop 22736 1727204256.57618: getting the next task for host managed-node2 22736 1727204256.57626: done getting next task for host managed-node2 22736 1727204256.57631: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204256.57634: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204256.57648: getting variables 22736 1727204256.57650: in VariableManager get_vars() 22736 1727204256.57698: Calling all_inventory to load vars for managed-node2 22736 1727204256.57702: Calling groups_inventory to load vars for managed-node2 22736 1727204256.57705: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204256.57837: Calling all_plugins_play to load vars for managed-node2 22736 1727204256.57843: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204256.57897: Calling groups_plugins_play to load vars for managed-node2 22736 1727204256.58792: done sending task result for task 12b410aa-8751-4f4a-548a-000000000029 22736 1727204256.58797: WORKER PROCESS EXITING 22736 1727204256.63094: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204256.69165: done with get_vars() 22736 1727204256.69527: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:36 -0400 (0:00:01.030) 0:00:21.482 ***** 22736 1727204256.69750: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204256.69753: Creating lock for fedora.linux_system_roles.network_state 22736 1727204256.70536: worker is 1 (out of 1 available) 22736 1727204256.70553: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204256.70567: done queuing things up, now waiting for results queue to drain 22736 1727204256.70569: waiting for pending results... 22736 1727204256.71083: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204256.71186: in run() - task 12b410aa-8751-4f4a-548a-00000000002a 22736 1727204256.71408: variable 'ansible_search_path' from source: unknown 22736 1727204256.71412: variable 'ansible_search_path' from source: unknown 22736 1727204256.71457: calling self._execute() 22736 1727204256.71575: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204256.71588: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204256.71799: variable 'omit' from source: magic vars 22736 1727204256.72645: variable 'ansible_distribution_major_version' from source: facts 22736 1727204256.72680: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204256.73010: variable 'network_state' from source: role '' defaults 22736 1727204256.73098: Evaluated conditional (network_state != {}): False 22736 1727204256.73101: when evaluation is False, skipping this task 22736 1727204256.73104: _execute() done 22736 1727204256.73106: dumping result to json 22736 1727204256.73108: done dumping result, returning 22736 1727204256.73115: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-4f4a-548a-00000000002a] 22736 1727204256.73118: sending task result for task 12b410aa-8751-4f4a-548a-00000000002a 22736 1727204256.73182: done sending task result for task 12b410aa-8751-4f4a-548a-00000000002a 22736 1727204256.73185: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204256.73264: no more pending results, returning what we have 
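The "Configure networking state" task is skipped because network_state is still at its role default of {} ('network_state' from source: role '' defaults), so the conditional network_state != {} evaluates to False. This code path only runs when the caller supplies a non-empty network_state; the sketch below is a hypothetical example of such a value (nmstate-style keys chosen for illustration, not taken from this run).

    # Hypothetical: any non-empty mapping flips the "network_state != {}" conditional.
    network_state:
      interfaces:
        - name: lsr27
          type: ethernet
          state: up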
22736 1727204256.73269: results queue empty 22736 1727204256.73270: checking for any_errors_fatal 22736 1727204256.73282: done checking for any_errors_fatal 22736 1727204256.73283: checking for max_fail_percentage 22736 1727204256.73285: done checking for max_fail_percentage 22736 1727204256.73287: checking to see if all hosts have failed and the running result is not ok 22736 1727204256.73288: done checking to see if all hosts have failed 22736 1727204256.73291: getting the remaining hosts for this loop 22736 1727204256.73293: done getting the remaining hosts for this loop 22736 1727204256.73298: getting the next task for host managed-node2 22736 1727204256.73306: done getting next task for host managed-node2 22736 1727204256.73311: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204256.73314: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204256.73333: getting variables 22736 1727204256.73335: in VariableManager get_vars() 22736 1727204256.73380: Calling all_inventory to load vars for managed-node2 22736 1727204256.73384: Calling groups_inventory to load vars for managed-node2 22736 1727204256.73387: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204256.73515: Calling all_plugins_play to load vars for managed-node2 22736 1727204256.73519: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204256.73523: Calling groups_plugins_play to load vars for managed-node2 22736 1727204256.78375: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204256.84660: done with get_vars() 22736 1727204256.84763: done getting variables 22736 1727204256.84839: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:36 -0400 (0:00:00.152) 0:00:21.634 ***** 22736 1727204256.84993: entering _queue_task() for managed-node2/debug 22736 1727204256.85770: worker is 1 (out of 1 available) 22736 1727204256.85785: exiting _queue_task() for managed-node2/debug 22736 1727204256.85915: done queuing things up, now waiting for results queue to drain 22736 1727204256.85917: waiting for pending results... 
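The debug task queued here ("Show stderr messages for the network_connections", tasks/main.yml:177) prints the stderr lines captured from the connection-profile run, as the "ok:" result further below confirms. A minimal stand-alone equivalent is shown next; treat it as a sketch rather than the role's verbatim task.

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines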
22736 1727204256.86369: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204256.86681: in run() - task 12b410aa-8751-4f4a-548a-00000000002b 22736 1727204256.86687: variable 'ansible_search_path' from source: unknown 22736 1727204256.86693: variable 'ansible_search_path' from source: unknown 22736 1727204256.86696: calling self._execute() 22736 1727204256.86788: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204256.86999: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204256.87011: variable 'omit' from source: magic vars 22736 1727204256.87862: variable 'ansible_distribution_major_version' from source: facts 22736 1727204256.87880: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204256.87883: variable 'omit' from source: magic vars 22736 1727204256.87990: variable 'omit' from source: magic vars 22736 1727204256.87995: variable 'omit' from source: magic vars 22736 1727204256.88222: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204256.88263: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204256.88285: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204256.88513: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204256.88534: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204256.88566: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204256.88569: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204256.88594: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204256.88702: Set connection var ansible_timeout to 10 22736 1727204256.88720: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204256.88752: Set connection var ansible_shell_executable to /bin/sh 22736 1727204256.88755: Set connection var ansible_shell_type to sh 22736 1727204256.88758: Set connection var ansible_pipelining to False 22736 1727204256.88760: Set connection var ansible_connection to ssh 22736 1727204256.88774: variable 'ansible_shell_executable' from source: unknown 22736 1727204256.88777: variable 'ansible_connection' from source: unknown 22736 1727204256.88780: variable 'ansible_module_compression' from source: unknown 22736 1727204256.88783: variable 'ansible_shell_type' from source: unknown 22736 1727204256.88785: variable 'ansible_shell_executable' from source: unknown 22736 1727204256.88862: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204256.89002: variable 'ansible_pipelining' from source: unknown 22736 1727204256.89006: variable 'ansible_timeout' from source: unknown 22736 1727204256.89014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204256.89387: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 
1727204256.89406: variable 'omit' from source: magic vars 22736 1727204256.89410: starting attempt loop 22736 1727204256.89413: running the handler 22736 1727204256.89571: variable '__network_connections_result' from source: set_fact 22736 1727204256.89841: handler run complete 22736 1727204256.89895: attempt loop complete, returning result 22736 1727204256.89898: _execute() done 22736 1727204256.89901: dumping result to json 22736 1727204256.89904: done dumping result, returning 22736 1727204256.89907: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-4f4a-548a-00000000002b] 22736 1727204256.89909: sending task result for task 12b410aa-8751-4f4a-548a-00000000002b 22736 1727204256.90012: done sending task result for task 12b410aa-8751-4f4a-548a-00000000002b 22736 1727204256.90016: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active)" ] } 22736 1727204256.90210: no more pending results, returning what we have 22736 1727204256.90214: results queue empty 22736 1727204256.90215: checking for any_errors_fatal 22736 1727204256.90222: done checking for any_errors_fatal 22736 1727204256.90222: checking for max_fail_percentage 22736 1727204256.90224: done checking for max_fail_percentage 22736 1727204256.90225: checking to see if all hosts have failed and the running result is not ok 22736 1727204256.90226: done checking to see if all hosts have failed 22736 1727204256.90227: getting the remaining hosts for this loop 22736 1727204256.90229: done getting the remaining hosts for this loop 22736 1727204256.90233: getting the next task for host managed-node2 22736 1727204256.90240: done getting next task for host managed-node2 22736 1727204256.90244: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204256.90246: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204256.90257: getting variables 22736 1727204256.90259: in VariableManager get_vars() 22736 1727204256.90415: Calling all_inventory to load vars for managed-node2 22736 1727204256.90419: Calling groups_inventory to load vars for managed-node2 22736 1727204256.90422: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204256.90433: Calling all_plugins_play to load vars for managed-node2 22736 1727204256.90437: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204256.90441: Calling groups_plugins_play to load vars for managed-node2 22736 1727204256.95249: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204257.01568: done with get_vars() 22736 1727204257.01625: done getting variables 22736 1727204257.01820: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.168) 0:00:21.803 ***** 22736 1727204257.01856: entering _queue_task() for managed-node2/debug 22736 1727204257.02615: worker is 1 (out of 1 available) 22736 1727204257.02632: exiting _queue_task() for managed-node2/debug 22736 1727204257.02645: done queuing things up, now waiting for results queue to drain 22736 1727204257.02647: waiting for pending results... 22736 1727204257.03408: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204257.03414: in run() - task 12b410aa-8751-4f4a-548a-00000000002c 22736 1727204257.03521: variable 'ansible_search_path' from source: unknown 22736 1727204257.03526: variable 'ansible_search_path' from source: unknown 22736 1727204257.03568: calling self._execute() 22736 1727204257.03756: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.03761: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.03764: variable 'omit' from source: magic vars 22736 1727204257.04850: variable 'ansible_distribution_major_version' from source: facts 22736 1727204257.04863: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204257.04871: variable 'omit' from source: magic vars 22736 1727204257.04924: variable 'omit' from source: magic vars 22736 1727204257.04971: variable 'omit' from source: magic vars 22736 1727204257.05280: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204257.05284: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204257.05288: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204257.05305: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204257.05400: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204257.05404: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204257.05407: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.05409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.05697: Set connection var ansible_timeout to 10 22736 1727204257.05712: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204257.05726: Set connection var ansible_shell_executable to /bin/sh 22736 1727204257.05729: Set connection var ansible_shell_type to sh 22736 1727204257.05740: Set connection var ansible_pipelining to False 22736 1727204257.05742: Set connection var ansible_connection to ssh 22736 1727204257.05769: variable 'ansible_shell_executable' from source: unknown 22736 1727204257.05772: variable 'ansible_connection' from source: unknown 22736 1727204257.05776: variable 'ansible_module_compression' from source: unknown 22736 1727204257.05778: variable 'ansible_shell_type' from source: unknown 22736 1727204257.05835: variable 'ansible_shell_executable' from source: unknown 22736 1727204257.05838: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.05898: variable 'ansible_pipelining' from source: unknown 22736 1727204257.05902: variable 'ansible_timeout' from source: unknown 22736 1727204257.05910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.06077: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204257.06091: variable 'omit' from source: magic vars 22736 1727204257.06381: starting attempt loop 22736 1727204257.06386: running the handler 22736 1727204257.06389: variable '__network_connections_result' from source: set_fact 22736 1727204257.06463: variable '__network_connections_result' from source: set_fact 22736 1727204257.06828: handler run complete 22736 1727204257.06865: attempt loop complete, returning result 22736 1727204257.06869: _execute() done 22736 1727204257.06872: dumping result to json 22736 1727204257.06882: done dumping result, returning 22736 1727204257.06894: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-4f4a-548a-00000000002c] 22736 1727204257.07105: sending task result for task 12b410aa-8751-4f4a-548a-00000000002c 22736 1727204257.07517: done sending task result for task 12b410aa-8751-4f4a-548a-00000000002c 22736 1727204257.07521: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "autoconnect": true, "interface_name": "lsr27", "ip": { "address": "192.0.2.1/24" }, "name": "lsr27", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[003] #0, state:up persistent_state:present, 'lsr27': add connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322\n[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active)\n", "stderr_lines": [ "[003] #0, state:up persistent_state:present, 'lsr27': add connection 
lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322", "[004] #0, state:up persistent_state:present, 'lsr27': up connection lsr27, 752bdb29-49cb-43fb-bb8f-6bafcdca1322 (not-active)" ] } } 22736 1727204257.07640: no more pending results, returning what we have 22736 1727204257.07643: results queue empty 22736 1727204257.07644: checking for any_errors_fatal 22736 1727204257.07651: done checking for any_errors_fatal 22736 1727204257.07652: checking for max_fail_percentage 22736 1727204257.07654: done checking for max_fail_percentage 22736 1727204257.07655: checking to see if all hosts have failed and the running result is not ok 22736 1727204257.07656: done checking to see if all hosts have failed 22736 1727204257.07657: getting the remaining hosts for this loop 22736 1727204257.07659: done getting the remaining hosts for this loop 22736 1727204257.07664: getting the next task for host managed-node2 22736 1727204257.07670: done getting next task for host managed-node2 22736 1727204257.07674: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204257.07677: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204257.07687: getting variables 22736 1727204257.07688: in VariableManager get_vars() 22736 1727204257.07727: Calling all_inventory to load vars for managed-node2 22736 1727204257.07730: Calling groups_inventory to load vars for managed-node2 22736 1727204257.07733: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204257.07858: Calling all_plugins_play to load vars for managed-node2 22736 1727204257.07863: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204257.07868: Calling groups_plugins_play to load vars for managed-node2 22736 1727204257.12422: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204257.18674: done with get_vars() 22736 1727204257.18845: done getting variables 22736 1727204257.19008: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.171) 0:00:21.975 ***** 22736 1727204257.19050: entering _queue_task() for managed-node2/debug 22736 1727204257.20006: worker is 1 (out of 1 available) 22736 1727204257.20026: exiting _queue_task() for managed-node2/debug 22736 1727204257.20039: done queuing things up, now waiting for results queue to drain 22736 1727204257.20040: waiting for pending results... 
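After the network_state debug task (skipped just below for the same network_state != {} reason as before), the last role task in this excerpt is "Re-test connectivity" (tasks/main.yml:192), which dispatches the stock ping module over the same multiplexed SSH connection and AnsiballZ transfer path shown for the earlier tasks. A minimal stand-alone equivalent, assuming nothing beyond the built-in module, is:

    - name: Re-test connectivity
      ansible.builtin.ping: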
22736 1727204257.20451: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204257.20786: in run() - task 12b410aa-8751-4f4a-548a-00000000002d 22736 1727204257.21098: variable 'ansible_search_path' from source: unknown 22736 1727204257.21103: variable 'ansible_search_path' from source: unknown 22736 1727204257.21106: calling self._execute() 22736 1727204257.21179: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.21219: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.21237: variable 'omit' from source: magic vars 22736 1727204257.22191: variable 'ansible_distribution_major_version' from source: facts 22736 1727204257.22216: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204257.22565: variable 'network_state' from source: role '' defaults 22736 1727204257.22581: Evaluated conditional (network_state != {}): False 22736 1727204257.22592: when evaluation is False, skipping this task 22736 1727204257.22600: _execute() done 22736 1727204257.22631: dumping result to json 22736 1727204257.22642: done dumping result, returning 22736 1727204257.22654: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-4f4a-548a-00000000002d] 22736 1727204257.22734: sending task result for task 12b410aa-8751-4f4a-548a-00000000002d 22736 1727204257.23028: done sending task result for task 12b410aa-8751-4f4a-548a-00000000002d 22736 1727204257.23032: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22736 1727204257.23087: no more pending results, returning what we have 22736 1727204257.23093: results queue empty 22736 1727204257.23094: checking for any_errors_fatal 22736 1727204257.23104: done checking for any_errors_fatal 22736 1727204257.23105: checking for max_fail_percentage 22736 1727204257.23106: done checking for max_fail_percentage 22736 1727204257.23108: checking to see if all hosts have failed and the running result is not ok 22736 1727204257.23109: done checking to see if all hosts have failed 22736 1727204257.23110: getting the remaining hosts for this loop 22736 1727204257.23111: done getting the remaining hosts for this loop 22736 1727204257.23118: getting the next task for host managed-node2 22736 1727204257.23125: done getting next task for host managed-node2 22736 1727204257.23129: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204257.23132: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204257.23149: getting variables 22736 1727204257.23150: in VariableManager get_vars() 22736 1727204257.23393: Calling all_inventory to load vars for managed-node2 22736 1727204257.23398: Calling groups_inventory to load vars for managed-node2 22736 1727204257.23401: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204257.23415: Calling all_plugins_play to load vars for managed-node2 22736 1727204257.23419: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204257.23423: Calling groups_plugins_play to load vars for managed-node2 22736 1727204257.28203: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204257.34270: done with get_vars() 22736 1727204257.34326: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:37 -0400 (0:00:00.155) 0:00:22.131 ***** 22736 1727204257.34648: entering _queue_task() for managed-node2/ping 22736 1727204257.34651: Creating lock for ping 22736 1727204257.35046: worker is 1 (out of 1 available) 22736 1727204257.35061: exiting _queue_task() for managed-node2/ping 22736 1727204257.35075: done queuing things up, now waiting for results queue to drain 22736 1727204257.35077: waiting for pending results... 22736 1727204257.35449: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204257.35598: in run() - task 12b410aa-8751-4f4a-548a-00000000002e 22736 1727204257.35603: variable 'ansible_search_path' from source: unknown 22736 1727204257.35606: variable 'ansible_search_path' from source: unknown 22736 1727204257.35610: calling self._execute() 22736 1727204257.35677: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.35684: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.35698: variable 'omit' from source: magic vars 22736 1727204257.36295: variable 'ansible_distribution_major_version' from source: facts 22736 1727204257.36299: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204257.36303: variable 'omit' from source: magic vars 22736 1727204257.36306: variable 'omit' from source: magic vars 22736 1727204257.36309: variable 'omit' from source: magic vars 22736 1727204257.36311: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204257.36360: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204257.36382: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204257.36405: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204257.36645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204257.36649: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204257.36652: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.36655: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.36658: Set connection var ansible_timeout to 10 22736 
1727204257.36662: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204257.36665: Set connection var ansible_shell_executable to /bin/sh 22736 1727204257.36668: Set connection var ansible_shell_type to sh 22736 1727204257.36670: Set connection var ansible_pipelining to False 22736 1727204257.36673: Set connection var ansible_connection to ssh 22736 1727204257.36676: variable 'ansible_shell_executable' from source: unknown 22736 1727204257.36679: variable 'ansible_connection' from source: unknown 22736 1727204257.36691: variable 'ansible_module_compression' from source: unknown 22736 1727204257.36694: variable 'ansible_shell_type' from source: unknown 22736 1727204257.36699: variable 'ansible_shell_executable' from source: unknown 22736 1727204257.36703: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204257.36709: variable 'ansible_pipelining' from source: unknown 22736 1727204257.36713: variable 'ansible_timeout' from source: unknown 22736 1727204257.36722: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204257.37083: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204257.37087: variable 'omit' from source: magic vars 22736 1727204257.37093: starting attempt loop 22736 1727204257.37096: running the handler 22736 1727204257.37099: _low_level_execute_command(): starting 22736 1727204257.37101: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204257.37780: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204257.37901: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204257.37923: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204257.38001: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204257.39777: stdout chunk (state=3): >>>/root <<< 22736 1727204257.39967: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204257.39974: stdout chunk (state=3): >>><<< 22736 1727204257.40099: stderr chunk (state=3): >>><<< 22736 1727204257.40126: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204257.40196: _low_level_execute_command(): starting 22736 1727204257.40200: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008 `" && echo ansible-tmp-1727204257.4012587-23788-57408064063008="` echo /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008 `" ) && sleep 0' 22736 1727204257.41296: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204257.41301: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204257.41367: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204257.41374: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204257.41496: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204257.41507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204257.41664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204257.41707: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204257.41787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204257.43877: stdout chunk (state=3): >>>ansible-tmp-1727204257.4012587-23788-57408064063008=/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008 <<< 22736 1727204257.44002: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204257.44193: stderr chunk (state=3): >>><<< 22736 1727204257.44196: stdout chunk (state=3): >>><<< 22736 1727204257.44221: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204257.4012587-23788-57408064063008=/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008 
, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204257.44277: variable 'ansible_module_compression' from source: unknown 22736 1727204257.44331: ANSIBALLZ: Using lock for ping 22736 1727204257.44335: ANSIBALLZ: Acquiring lock 22736 1727204257.44337: ANSIBALLZ: Lock acquired: 140553535185088 22736 1727204257.44343: ANSIBALLZ: Creating module 22736 1727204257.89563: ANSIBALLZ: Writing module into payload 22736 1727204257.89728: ANSIBALLZ: Writing module 22736 1727204257.89805: ANSIBALLZ: Renaming module 22736 1727204257.90001: ANSIBALLZ: Done creating module 22736 1727204257.90004: variable 'ansible_facts' from source: unknown 22736 1727204257.90070: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py 22736 1727204257.90457: Sending initial data 22736 1727204257.90550: Sent initial data (152 bytes) 22736 1727204257.91818: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204257.91838: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204257.91865: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204257.91975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204257.92014: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204257.92200: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204257.92250: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204257.94028: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports 
extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204257.94054: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204257.94126: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp2v91qhby /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py <<< 22736 1727204257.94131: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py" <<< 22736 1727204257.94237: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp2v91qhby" to remote "/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py" <<< 22736 1727204257.96312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204257.96317: stdout chunk (state=3): >>><<< 22736 1727204257.96319: stderr chunk (state=3): >>><<< 22736 1727204257.96321: done transferring module to remote 22736 1727204257.96324: _low_level_execute_command(): starting 22736 1727204257.96327: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/ /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py && sleep 0' 22736 1727204257.97672: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204257.97676: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204257.97779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204257.97784: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204257.97787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204257.98057: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204257.98097: stderr chunk 
(state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204257.98176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.00349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.00354: stderr chunk (state=3): >>><<< 22736 1727204258.00356: stdout chunk (state=3): >>><<< 22736 1727204258.00377: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204258.00380: _low_level_execute_command(): starting 22736 1727204258.00406: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/AnsiballZ_ping.py && sleep 0' 22736 1727204258.01763: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.01767: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.01895: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204258.01899: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.01975: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204258.01979: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204258.02071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.19733: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22736 1727204258.21455: stderr chunk (state=3): >>>debug2: Received exit status from master 
0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204258.21461: stdout chunk (state=3): >>><<< 22736 1727204258.21463: stderr chunk (state=3): >>><<< 22736 1727204258.21496: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204258.21805: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204258.21810: _low_level_execute_command(): starting 22736 1727204258.21813: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204257.4012587-23788-57408064063008/ > /dev/null 2>&1 && sleep 0' 22736 1727204258.23414: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.23474: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204258.23545: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.23800: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204258.23848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.24003: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.26055: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.26118: stderr chunk (state=3): >>><<< 22736 1727204258.26128: stdout chunk (state=3): >>><<< 22736 1727204258.26160: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204258.26178: handler run complete 22736 1727204258.26204: attempt loop complete, returning result 22736 1727204258.26263: _execute() done 22736 1727204258.26266: dumping result to json 22736 1727204258.26268: done dumping result, returning 22736 1727204258.26270: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-4f4a-548a-00000000002e] 22736 1727204258.26273: sending task result for task 12b410aa-8751-4f4a-548a-00000000002e ok: [managed-node2] => { "changed": false, "ping": "pong" } 22736 1727204258.26435: no more pending results, returning what we have 22736 1727204258.26439: results queue empty 22736 1727204258.26440: checking for any_errors_fatal 22736 1727204258.26448: done checking for any_errors_fatal 22736 1727204258.26449: checking for max_fail_percentage 22736 1727204258.26451: done checking for max_fail_percentage 22736 1727204258.26452: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.26453: done checking to see if all hosts have failed 22736 1727204258.26454: getting the remaining hosts for this loop 22736 1727204258.26456: done getting the remaining hosts for this loop 22736 1727204258.26460: getting the next task for host managed-node2 22736 1727204258.26468: done getting next task for host managed-node2 22736 1727204258.26471: ^ task is: TASK: meta (role_complete) 22736 1727204258.26474: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.26486: getting variables 22736 1727204258.26488: in VariableManager get_vars() 22736 1727204258.26554: Calling all_inventory to load vars for managed-node2 22736 1727204258.26558: Calling groups_inventory to load vars for managed-node2 22736 1727204258.26561: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.26568: done sending task result for task 12b410aa-8751-4f4a-548a-00000000002e 22736 1727204258.26571: WORKER PROCESS EXITING 22736 1727204258.26581: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.26584: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.26587: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.28976: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.36318: done with get_vars() 22736 1727204258.36366: done getting variables 22736 1727204258.36463: done queuing things up, now waiting for results queue to drain 22736 1727204258.36466: results queue empty 22736 1727204258.36467: checking for any_errors_fatal 22736 1727204258.36470: done checking for any_errors_fatal 22736 1727204258.36471: checking for max_fail_percentage 22736 1727204258.36473: done checking for max_fail_percentage 22736 1727204258.36474: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.36475: done checking to see if all hosts have failed 22736 1727204258.36476: getting the remaining hosts for this loop 22736 1727204258.36477: done getting the remaining hosts for this loop 22736 1727204258.36480: getting the next task for host managed-node2 22736 1727204258.36485: done getting next task for host managed-node2 22736 1727204258.36487: ^ task is: TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 22736 1727204258.36492: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.36494: getting variables 22736 1727204258.36496: in VariableManager get_vars() 22736 1727204258.36510: Calling all_inventory to load vars for managed-node2 22736 1727204258.36516: Calling groups_inventory to load vars for managed-node2 22736 1727204258.36519: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.36525: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.36528: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.36532: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.38684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.42566: done with get_vars() 22736 1727204258.42619: done getting variables TASK [Include the task 'assert_output_in_stderr_without_warnings.yml'] ********* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:47 Tuesday 24 September 2024 14:57:38 -0400 (0:00:01.080) 0:00:23.212 ***** 22736 1727204258.42736: entering _queue_task() for managed-node2/include_tasks 22736 1727204258.43332: worker is 1 (out of 1 available) 22736 1727204258.43344: exiting _queue_task() for managed-node2/include_tasks 22736 1727204258.43355: done queuing things up, now waiting for results queue to drain 22736 1727204258.43357: waiting for pending results... 22736 1727204258.43675: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' 22736 1727204258.43772: in run() - task 12b410aa-8751-4f4a-548a-000000000030 22736 1727204258.43818: variable 'ansible_search_path' from source: unknown 22736 1727204258.43894: calling self._execute() 22736 1727204258.44038: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.44097: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.44101: variable 'omit' from source: magic vars 22736 1727204258.44568: variable 'ansible_distribution_major_version' from source: facts 22736 1727204258.44591: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204258.44605: _execute() done 22736 1727204258.44618: dumping result to json 22736 1727204258.44643: done dumping result, returning 22736 1727204258.44647: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_output_in_stderr_without_warnings.yml' [12b410aa-8751-4f4a-548a-000000000030] 22736 1727204258.44671: sending task result for task 12b410aa-8751-4f4a-548a-000000000030 22736 1727204258.44904: done sending task result for task 12b410aa-8751-4f4a-548a-000000000030 22736 1727204258.44907: WORKER PROCESS EXITING 22736 1727204258.44944: no more pending results, returning what we have 22736 1727204258.44950: in VariableManager get_vars() 22736 1727204258.45007: Calling all_inventory to load vars for managed-node2 22736 1727204258.45011: Calling groups_inventory to load vars for managed-node2 22736 1727204258.45016: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.45035: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.45038: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.45042: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.47992: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 
1727204258.51280: done with get_vars() 22736 1727204258.51318: variable 'ansible_search_path' from source: unknown 22736 1727204258.51341: we have included files to process 22736 1727204258.51343: generating all_blocks data 22736 1727204258.51346: done generating all_blocks data 22736 1727204258.51353: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22736 1727204258.51354: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22736 1727204258.51358: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml 22736 1727204258.51859: done processing included file 22736 1727204258.51862: iterating over new_blocks loaded from include file 22736 1727204258.51864: in VariableManager get_vars() 22736 1727204258.51901: done with get_vars() 22736 1727204258.51903: filtering new block on tags 22736 1727204258.51926: done filtering new block on tags 22736 1727204258.51930: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml for managed-node2 22736 1727204258.51936: extending task lists for all hosts with included blocks 22736 1727204258.51976: done extending task lists 22736 1727204258.51978: done processing included files 22736 1727204258.51979: results queue empty 22736 1727204258.51980: checking for any_errors_fatal 22736 1727204258.51981: done checking for any_errors_fatal 22736 1727204258.51982: checking for max_fail_percentage 22736 1727204258.51984: done checking for max_fail_percentage 22736 1727204258.51985: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.51986: done checking to see if all hosts have failed 22736 1727204258.51987: getting the remaining hosts for this loop 22736 1727204258.51993: done getting the remaining hosts for this loop 22736 1727204258.51997: getting the next task for host managed-node2 22736 1727204258.52001: done getting next task for host managed-node2 22736 1727204258.52003: ^ task is: TASK: Assert that warnings is empty 22736 1727204258.52006: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.52009: getting variables 22736 1727204258.52010: in VariableManager get_vars() 22736 1727204258.52023: Calling all_inventory to load vars for managed-node2 22736 1727204258.52026: Calling groups_inventory to load vars for managed-node2 22736 1727204258.52028: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.52042: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.52046: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.52051: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.54225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.57732: done with get_vars() 22736 1727204258.57786: done getting variables 22736 1727204258.57881: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that warnings is empty] ******************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:3 Tuesday 24 September 2024 14:57:38 -0400 (0:00:00.151) 0:00:23.364 ***** 22736 1727204258.57933: entering _queue_task() for managed-node2/assert 22736 1727204258.58505: worker is 1 (out of 1 available) 22736 1727204258.58517: exiting _queue_task() for managed-node2/assert 22736 1727204258.58609: done queuing things up, now waiting for results queue to drain 22736 1727204258.58611: waiting for pending results... 
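Editor's note: the records up to this point trace one complete AnsiballZ round trip for the `ping` module: a guarded `mkdir` for a private remote temp directory, an SFTP `put` of `AnsiballZ_ping.py`, a `chmod u+x` on the directory and the wrapper, execution with `/usr/bin/python3.12`, and finally an `rm -f -r` of the temp directory. The sketch below only makes that order of operations explicit; the `run_ssh()` helper, host alias and temp path are hypothetical stand-ins, not Ansible APIs, and plain `scp` stands in for the sftp transfer shown in the log.

```python
import shlex
import subprocess

HOST = "managed-node2"  # hypothetical alias; the log connects to 10.31.9.159 as root
TMP = "/root/.ansible/tmp/ansible-tmp-example"  # stand-in for the timestamped dir in the log

def run_ssh(cmd: str) -> subprocess.CompletedProcess:
    """Run a command on the managed node the way the log shows Ansible doing it:
    everything is wrapped in /bin/sh -c '<cmd> && sleep 0'."""
    wrapped = f"/bin/sh -c {shlex.quote(cmd + ' && sleep 0')}"
    return subprocess.run(["ssh", HOST, wrapped], capture_output=True, text=True, check=True)

# 1. Create the private remote temp directory (umask 77, as in the log).
run_ssh(f'( umask 77 && mkdir -p "{TMP}" )')

# 2. Transfer the AnsiballZ wrapper (the log shows an sftp "put"; scp is a stand-in here).
subprocess.run(["scp", "AnsiballZ_ping.py", f"{HOST}:{TMP}/AnsiballZ_ping.py"], check=True)

# 3. Make it executable, run it with the remote interpreter, then clean up.
run_ssh(f"chmod u+x {TMP}/ {TMP}/AnsiballZ_ping.py")
result = run_ssh(f"/usr/bin/python3.12 {TMP}/AnsiballZ_ping.py")
run_ssh(f"rm -f -r {TMP}/ > /dev/null 2>&1")

print(result.stdout)  # the module prints JSON, e.g. {"ping": "pong", ...}
```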
22736 1727204258.59000: running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty 22736 1727204258.59065: in run() - task 12b410aa-8751-4f4a-548a-000000000304 22736 1727204258.59111: variable 'ansible_search_path' from source: unknown 22736 1727204258.59136: variable 'ansible_search_path' from source: unknown 22736 1727204258.59205: calling self._execute() 22736 1727204258.59410: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.59416: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.59420: variable 'omit' from source: magic vars 22736 1727204258.59926: variable 'ansible_distribution_major_version' from source: facts 22736 1727204258.59951: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204258.59992: variable 'omit' from source: magic vars 22736 1727204258.60033: variable 'omit' from source: magic vars 22736 1727204258.60093: variable 'omit' from source: magic vars 22736 1727204258.60170: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204258.60239: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204258.60258: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204258.60278: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.60293: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.60325: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204258.60329: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.60334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.60425: Set connection var ansible_timeout to 10 22736 1727204258.60437: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204258.60446: Set connection var ansible_shell_executable to /bin/sh 22736 1727204258.60449: Set connection var ansible_shell_type to sh 22736 1727204258.60455: Set connection var ansible_pipelining to False 22736 1727204258.60457: Set connection var ansible_connection to ssh 22736 1727204258.60477: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.60481: variable 'ansible_connection' from source: unknown 22736 1727204258.60484: variable 'ansible_module_compression' from source: unknown 22736 1727204258.60486: variable 'ansible_shell_type' from source: unknown 22736 1727204258.60497: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.60500: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.60502: variable 'ansible_pipelining' from source: unknown 22736 1727204258.60505: variable 'ansible_timeout' from source: unknown 22736 1727204258.60510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.60641: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204258.60652: variable 'omit' from source: magic vars 22736 
1727204258.60659: starting attempt loop 22736 1727204258.60662: running the handler 22736 1727204258.60780: variable '__network_connections_result' from source: set_fact 22736 1727204258.60793: Evaluated conditional ('warnings' not in __network_connections_result): True 22736 1727204258.60800: handler run complete 22736 1727204258.60817: attempt loop complete, returning result 22736 1727204258.60822: _execute() done 22736 1727204258.60824: dumping result to json 22736 1727204258.60827: done dumping result, returning 22736 1727204258.60836: done running TaskExecutor() for managed-node2/TASK: Assert that warnings is empty [12b410aa-8751-4f4a-548a-000000000304] 22736 1727204258.60838: sending task result for task 12b410aa-8751-4f4a-548a-000000000304 22736 1727204258.60933: done sending task result for task 12b410aa-8751-4f4a-548a-000000000304 22736 1727204258.60938: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22736 1727204258.61006: no more pending results, returning what we have 22736 1727204258.61010: results queue empty 22736 1727204258.61011: checking for any_errors_fatal 22736 1727204258.61015: done checking for any_errors_fatal 22736 1727204258.61016: checking for max_fail_percentage 22736 1727204258.61018: done checking for max_fail_percentage 22736 1727204258.61019: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.61020: done checking to see if all hosts have failed 22736 1727204258.61021: getting the remaining hosts for this loop 22736 1727204258.61023: done getting the remaining hosts for this loop 22736 1727204258.61027: getting the next task for host managed-node2 22736 1727204258.61034: done getting next task for host managed-node2 22736 1727204258.61037: ^ task is: TASK: Assert that there is output in stderr 22736 1727204258.61040: ^ state is: HOST STATE: block=4, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.61044: getting variables 22736 1727204258.61046: in VariableManager get_vars() 22736 1727204258.61127: Calling all_inventory to load vars for managed-node2 22736 1727204258.61130: Calling groups_inventory to load vars for managed-node2 22736 1727204258.61134: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.61145: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.61148: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.61152: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.62790: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.65218: done with get_vars() 22736 1727204258.65245: done getting variables 22736 1727204258.65297: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Assert that there is output in stderr] *********************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_output_in_stderr_without_warnings.yml:8 Tuesday 24 September 2024 14:57:38 -0400 (0:00:00.073) 0:00:23.438 ***** 22736 1727204258.65322: entering _queue_task() for managed-node2/assert 22736 1727204258.65591: worker is 1 (out of 1 available) 22736 1727204258.65604: exiting _queue_task() for managed-node2/assert 22736 1727204258.65617: done queuing things up, now waiting for results queue to drain 22736 1727204258.65619: waiting for pending results... 
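Editor's note: both tasks in the included file assert_output_in_stderr_without_warnings.yml reduce to membership tests on the `__network_connections_result` fact registered by the network role. The first condition, `'warnings' not in __network_connections_result`, has just evaluated True above; the second, `'stderr' in __network_connections_result`, is queued next. A minimal Python illustration of those two checks follows; the dictionary is a hypothetical stand-in, since the log never prints the fact's contents.

```python
# Hypothetical stand-in for the registered fact; only the keys matter for these checks.
__network_connections_result = {
    "stderr": "<stderr text captured by the role>",  # placeholder value
    # deliberately no "warnings" key
}

# TASK [Assert that warnings is empty]
#   conditional (as evaluated in the log): 'warnings' not in __network_connections_result
assert "warnings" not in __network_connections_result

# TASK [Assert that there is output in stderr]
#   conditional (as evaluated in the log): 'stderr' in __network_connections_result
assert "stderr" in __network_connections_result
```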
22736 1727204258.65816: running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr 22736 1727204258.65913: in run() - task 12b410aa-8751-4f4a-548a-000000000305 22736 1727204258.65929: variable 'ansible_search_path' from source: unknown 22736 1727204258.65933: variable 'ansible_search_path' from source: unknown 22736 1727204258.65969: calling self._execute() 22736 1727204258.66050: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.66058: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.66071: variable 'omit' from source: magic vars 22736 1727204258.66412: variable 'ansible_distribution_major_version' from source: facts 22736 1727204258.66424: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204258.66431: variable 'omit' from source: magic vars 22736 1727204258.66468: variable 'omit' from source: magic vars 22736 1727204258.66502: variable 'omit' from source: magic vars 22736 1727204258.66541: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204258.66572: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204258.66590: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204258.66607: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.66624: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.66653: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204258.66656: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.66659: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.66749: Set connection var ansible_timeout to 10 22736 1727204258.66761: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204258.66769: Set connection var ansible_shell_executable to /bin/sh 22736 1727204258.66772: Set connection var ansible_shell_type to sh 22736 1727204258.66779: Set connection var ansible_pipelining to False 22736 1727204258.66782: Set connection var ansible_connection to ssh 22736 1727204258.66803: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.66806: variable 'ansible_connection' from source: unknown 22736 1727204258.66809: variable 'ansible_module_compression' from source: unknown 22736 1727204258.66813: variable 'ansible_shell_type' from source: unknown 22736 1727204258.66820: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.66822: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.66830: variable 'ansible_pipelining' from source: unknown 22736 1727204258.66833: variable 'ansible_timeout' from source: unknown 22736 1727204258.66838: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.66970: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204258.66980: variable 'omit' from source: magic vars 
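Editor's note: before either assertion is checked, every task in this run passes the same gate, `Evaluated conditional (ansible_distribution_major_version != '6')`, and only then are the ssh connection and sh shell plugins resolved and the connection variables set (timeout 10, ZIP_DEFLATED module compression, pipelining off). A trivial sketch of that gate; the fact value shown is hypothetical, since the log does not print it.

```python
facts = {"ansible_distribution_major_version": "40"}  # hypothetical value for managed-node2

def distribution_gate(facts: dict) -> bool:
    # Mirrors the repeated record:
    # "Evaluated conditional (ansible_distribution_major_version != '6'): True"
    return facts.get("ansible_distribution_major_version") != "6"

assert distribution_gate(facts)
```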
22736 1727204258.66985: starting attempt loop 22736 1727204258.66988: running the handler 22736 1727204258.67105: variable '__network_connections_result' from source: set_fact 22736 1727204258.67117: Evaluated conditional ('stderr' in __network_connections_result): True 22736 1727204258.67125: handler run complete 22736 1727204258.67139: attempt loop complete, returning result 22736 1727204258.67142: _execute() done 22736 1727204258.67146: dumping result to json 22736 1727204258.67151: done dumping result, returning 22736 1727204258.67158: done running TaskExecutor() for managed-node2/TASK: Assert that there is output in stderr [12b410aa-8751-4f4a-548a-000000000305] 22736 1727204258.67166: sending task result for task 12b410aa-8751-4f4a-548a-000000000305 22736 1727204258.67253: done sending task result for task 12b410aa-8751-4f4a-548a-000000000305 22736 1727204258.67256: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22736 1727204258.67340: no more pending results, returning what we have 22736 1727204258.67344: results queue empty 22736 1727204258.67345: checking for any_errors_fatal 22736 1727204258.67353: done checking for any_errors_fatal 22736 1727204258.67353: checking for max_fail_percentage 22736 1727204258.67355: done checking for max_fail_percentage 22736 1727204258.67356: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.67357: done checking to see if all hosts have failed 22736 1727204258.67358: getting the remaining hosts for this loop 22736 1727204258.67360: done getting the remaining hosts for this loop 22736 1727204258.67364: getting the next task for host managed-node2 22736 1727204258.67372: done getting next task for host managed-node2 22736 1727204258.67375: ^ task is: TASK: meta (flush_handlers) 22736 1727204258.67379: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.67383: getting variables 22736 1727204258.67385: in VariableManager get_vars() 22736 1727204258.67421: Calling all_inventory to load vars for managed-node2 22736 1727204258.67424: Calling groups_inventory to load vars for managed-node2 22736 1727204258.67427: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.67439: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.67442: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.67445: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.68857: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.70438: done with get_vars() 22736 1727204258.70459: done getting variables 22736 1727204258.70517: in VariableManager get_vars() 22736 1727204258.70527: Calling all_inventory to load vars for managed-node2 22736 1727204258.70529: Calling groups_inventory to load vars for managed-node2 22736 1727204258.70530: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.70534: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.70537: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.70541: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.72661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.74418: done with get_vars() 22736 1727204258.74442: done queuing things up, now waiting for results queue to drain 22736 1727204258.74444: results queue empty 22736 1727204258.74444: checking for any_errors_fatal 22736 1727204258.74447: done checking for any_errors_fatal 22736 1727204258.74447: checking for max_fail_percentage 22736 1727204258.74448: done checking for max_fail_percentage 22736 1727204258.74449: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.74449: done checking to see if all hosts have failed 22736 1727204258.74450: getting the remaining hosts for this loop 22736 1727204258.74455: done getting the remaining hosts for this loop 22736 1727204258.74457: getting the next task for host managed-node2 22736 1727204258.74461: done getting next task for host managed-node2 22736 1727204258.74462: ^ task is: TASK: meta (flush_handlers) 22736 1727204258.74463: ^ state is: HOST STATE: block=6, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.74465: getting variables 22736 1727204258.74466: in VariableManager get_vars() 22736 1727204258.74477: Calling all_inventory to load vars for managed-node2 22736 1727204258.74479: Calling groups_inventory to load vars for managed-node2 22736 1727204258.74481: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.74485: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.74487: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.74491: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.76401: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.78284: done with get_vars() 22736 1727204258.78314: done getting variables 22736 1727204258.78362: in VariableManager get_vars() 22736 1727204258.78373: Calling all_inventory to load vars for managed-node2 22736 1727204258.78375: Calling groups_inventory to load vars for managed-node2 22736 1727204258.78376: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.78381: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.78382: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.78385: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.79551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.81638: done with get_vars() 22736 1727204258.81669: done queuing things up, now waiting for results queue to drain 22736 1727204258.81672: results queue empty 22736 1727204258.81673: checking for any_errors_fatal 22736 1727204258.81675: done checking for any_errors_fatal 22736 1727204258.81675: checking for max_fail_percentage 22736 1727204258.81676: done checking for max_fail_percentage 22736 1727204258.81677: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.81677: done checking to see if all hosts have failed 22736 1727204258.81678: getting the remaining hosts for this loop 22736 1727204258.81679: done getting the remaining hosts for this loop 22736 1727204258.81681: getting the next task for host managed-node2 22736 1727204258.81684: done getting next task for host managed-node2 22736 1727204258.81685: ^ task is: None 22736 1727204258.81686: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.81687: done queuing things up, now waiting for results queue to drain 22736 1727204258.81687: results queue empty 22736 1727204258.81688: checking for any_errors_fatal 22736 1727204258.81690: done checking for any_errors_fatal 22736 1727204258.81691: checking for max_fail_percentage 22736 1727204258.81692: done checking for max_fail_percentage 22736 1727204258.81692: checking to see if all hosts have failed and the running result is not ok 22736 1727204258.81693: done checking to see if all hosts have failed 22736 1727204258.81694: getting the next task for host managed-node2 22736 1727204258.81696: done getting next task for host managed-node2 22736 1727204258.81696: ^ task is: None 22736 1727204258.81697: ^ state is: HOST STATE: block=7, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204258.81737: in VariableManager get_vars() 22736 1727204258.81751: done with get_vars() 22736 1727204258.81756: in VariableManager get_vars() 22736 1727204258.81762: done with get_vars() 22736 1727204258.81766: variable 'omit' from source: magic vars 22736 1727204258.81797: in VariableManager get_vars() 22736 1727204258.81806: done with get_vars() 22736 1727204258.81824: variable 'omit' from source: magic vars PLAY [Play for cleaning up the test device and the connection profile] ********* 22736 1727204258.81971: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204258.81994: getting the remaining hosts for this loop 22736 1727204258.81996: done getting the remaining hosts for this loop 22736 1727204258.81999: getting the next task for host managed-node2 22736 1727204258.82002: done getting next task for host managed-node2 22736 1727204258.82004: ^ task is: TASK: Gathering Facts 22736 1727204258.82006: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204258.82008: getting variables 22736 1727204258.82008: in VariableManager get_vars() 22736 1727204258.82016: Calling all_inventory to load vars for managed-node2 22736 1727204258.82018: Calling groups_inventory to load vars for managed-node2 22736 1727204258.82020: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204258.82025: Calling all_plugins_play to load vars for managed-node2 22736 1727204258.82027: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204258.82029: Calling groups_plugins_play to load vars for managed-node2 22736 1727204258.83134: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204258.84732: done with get_vars() 22736 1727204258.84752: done getting variables 22736 1727204258.84788: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 Tuesday 24 September 2024 14:57:38 -0400 (0:00:00.194) 0:00:23.633 ***** 22736 1727204258.84810: entering _queue_task() for managed-node2/gather_facts 22736 1727204258.85069: worker is 1 (out of 1 available) 22736 1727204258.85083: exiting _queue_task() for managed-node2/gather_facts 22736 1727204258.85100: done queuing things up, now waiting for results queue to drain 22736 1727204258.85101: waiting for pending results... 
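Editor's note: the cleanup play opens with the implicit "Gathering Facts" task, which goes through the same pipeline as the ping module earlier: discover the remote home directory with `echo ~` (the records below show it returning /root), create a fresh temp directory under it, transfer the cached AnsiballZ setup payload, and execute it. The small sketch below mirrors that first step, the expansion of the default remote_tmp (`~/.ansible/tmp`, as seen in the module args earlier) against the `echo ~` result; the helper name is hypothetical.

```python
def expand_remote_tmp(remote_home: str, remote_tmp: str = "~/.ansible/tmp") -> str:
    """Hypothetical helper mirroring how the '~' in remote_tmp is resolved against
    the stdout of: /bin/sh -c 'echo ~ && sleep 0' (which the log records as /root)."""
    return remote_tmp.replace("~", remote_home, 1)

assert expand_remote_tmp("/root") == "/root/.ansible/tmp"
```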
22736 1727204258.85300: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204258.85374: in run() - task 12b410aa-8751-4f4a-548a-000000000316 22736 1727204258.85387: variable 'ansible_search_path' from source: unknown 22736 1727204258.85423: calling self._execute() 22736 1727204258.85499: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.85506: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.85519: variable 'omit' from source: magic vars 22736 1727204258.85849: variable 'ansible_distribution_major_version' from source: facts 22736 1727204258.85861: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204258.85869: variable 'omit' from source: magic vars 22736 1727204258.85897: variable 'omit' from source: magic vars 22736 1727204258.85930: variable 'omit' from source: magic vars 22736 1727204258.85964: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204258.86000: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204258.86021: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204258.86038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.86049: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204258.86076: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204258.86080: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.86083: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.86172: Set connection var ansible_timeout to 10 22736 1727204258.86183: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204258.86192: Set connection var ansible_shell_executable to /bin/sh 22736 1727204258.86196: Set connection var ansible_shell_type to sh 22736 1727204258.86206: Set connection var ansible_pipelining to False 22736 1727204258.86209: Set connection var ansible_connection to ssh 22736 1727204258.86232: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.86236: variable 'ansible_connection' from source: unknown 22736 1727204258.86239: variable 'ansible_module_compression' from source: unknown 22736 1727204258.86242: variable 'ansible_shell_type' from source: unknown 22736 1727204258.86244: variable 'ansible_shell_executable' from source: unknown 22736 1727204258.86250: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204258.86255: variable 'ansible_pipelining' from source: unknown 22736 1727204258.86257: variable 'ansible_timeout' from source: unknown 22736 1727204258.86263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204258.86426: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204258.86436: variable 'omit' from source: magic vars 22736 1727204258.86447: starting attempt loop 22736 1727204258.86450: running the 
handler 22736 1727204258.86464: variable 'ansible_facts' from source: unknown 22736 1727204258.86482: _low_level_execute_command(): starting 22736 1727204258.86491: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204258.87052: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.87056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.87060: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204258.87062: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.87121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204258.87125: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.87183: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.88999: stdout chunk (state=3): >>>/root <<< 22736 1727204258.89105: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.89168: stderr chunk (state=3): >>><<< 22736 1727204258.89173: stdout chunk (state=3): >>><<< 22736 1727204258.89202: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204258.89217: _low_level_execute_command(): starting 22736 1727204258.89220: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117 `" && echo ansible-tmp-1727204258.892007-23828-132685957405117="` echo 
/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117 `" ) && sleep 0' 22736 1727204258.89696: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.89732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.89736: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.89746: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.89798: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204258.89803: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.89847: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.91953: stdout chunk (state=3): >>>ansible-tmp-1727204258.892007-23828-132685957405117=/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117 <<< 22736 1727204258.92078: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.92134: stderr chunk (state=3): >>><<< 22736 1727204258.92137: stdout chunk (state=3): >>><<< 22736 1727204258.92154: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204258.892007-23828-132685957405117=/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204258.92185: variable 'ansible_module_compression' from source: unknown 22736 1727204258.92233: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204258.92294: 
variable 'ansible_facts' from source: unknown 22736 1727204258.92405: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py 22736 1727204258.92528: Sending initial data 22736 1727204258.92532: Sent initial data (153 bytes) 22736 1727204258.93007: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204258.93011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204258.93016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204258.93018: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.93072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204258.93075: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.93120: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.94839: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" <<< 22736 1727204258.94849: stderr chunk (state=3): >>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204258.94883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204258.94923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuft4hqyw /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py <<< 22736 1727204258.94931: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py" <<< 22736 1727204258.94963: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuft4hqyw" to remote "/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py" <<< 22736 1727204258.96638: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.96704: stderr chunk (state=3): >>><<< 22736 1727204258.96707: stdout chunk (state=3): >>><<< 22736 1727204258.96733: done transferring module to remote 22736 1727204258.96745: _low_level_execute_command(): starting 22736 1727204258.96751: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/ /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py && sleep 0' 22736 1727204258.97219: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204258.97223: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204258.97225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204258.97227: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204258.97230: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.97291: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204258.97296: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.97330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204258.99299: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204258.99351: stderr chunk (state=3): >>><<< 22736 1727204258.99354: stdout chunk (state=3): >>><<< 22736 1727204258.99370: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204258.99373: _low_level_execute_command(): starting 22736 1727204258.99379: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/AnsiballZ_setup.py && sleep 0' 22736 1727204258.99846: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.99850: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.99852: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204258.99855: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204258.99905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204258.99913: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204258.99981: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204259.71247: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "X<<< 22736 1727204259.71258: stdout chunk (state=3): >>>DG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", 
"PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3463, "used": 254}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 763, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146932224, "block_size": 4096, "block_total": 64479564, "block_available": 61315169, "block_used": 3164395, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": 
false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, <<< 22736 1727204259.71367: stdout chunk (state=3): >>>"ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::7aca:905f:e6:3e16", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.95263671875, "5m": 0.68212890625, "15m": 0.4140625}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "39", "epoch": "1727204259", "epoch_int": "1727204259", "date": "2024-09-24", "time": "14:57:39", "iso8601_micro": "2024-09-24T18:57:39.708421Z", "iso8601": "2024-09-24T18:57:39Z", "iso8601_basic": "20240924T145739708421", "iso8601_basic_short": "20240924T145739", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204259.73597: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204259.73604: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204259.73798: stderr chunk (state=3): >>><<< 22736 1727204259.73802: stdout chunk (state=3): >>><<< 22736 1727204259.73904: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_lsb": {}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_is_chroot": false, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, 
"version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2833, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 884, "free": 2833}, "nocache": {"free": 3463, "used": 254}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", 
"support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 763, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146932224, "block_size": 4096, "block_total": 64479564, "block_available": 61315169, "block_used": 3164395, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["peerlsr27", "lo", "eth0", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::7aca:905f:e6:3e16", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16"]}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.95263671875, "5m": 0.68212890625, "15m": 0.4140625}, "ansible_local": {}, "ansible_hostnqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", 
"weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "39", "epoch": "1727204259", "epoch_int": "1727204259", "date": "2024-09-24", "time": "14:57:39", "iso8601_micro": "2024-09-24T18:57:39.708421Z", "iso8601": "2024-09-24T18:57:39Z", "iso8601_basic": "20240924T145739708421", "iso8601_basic_short": "20240924T145739", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fibre_channel_wwn": [], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204259.74872: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204259.74876: _low_level_execute_command(): starting 22736 1727204259.74879: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204258.892007-23828-132685957405117/ > /dev/null 2>&1 && sleep 0' 22736 1727204259.75557: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204259.75574: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204259.75666: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204259.75722: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204259.75741: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204259.75778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204259.75852: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204259.78018: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204259.78057: stdout chunk (state=3): >>><<< 22736 1727204259.78103: stderr chunk (state=3): >>><<< 22736 1727204259.78320: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204259.78328: handler run complete 22736 1727204259.78945: variable 'ansible_facts' from source: unknown 22736 1727204259.79300: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204259.80322: variable 'ansible_facts' from source: unknown 22736 1727204259.80537: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204259.80820: attempt loop complete, returning result 22736 1727204259.80830: _execute() done 22736 1727204259.80844: dumping result to json 22736 1727204259.80891: done dumping result, returning 22736 1727204259.80991: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-000000000316] 22736 1727204259.80994: sending task result for task 12b410aa-8751-4f4a-548a-000000000316 ok: [managed-node2] 22736 1727204259.82245: done sending task result for task 12b410aa-8751-4f4a-548a-000000000316 22736 1727204259.82248: WORKER PROCESS EXITING 22736 1727204259.82524: no more pending results, returning what we have 22736 1727204259.82528: results queue empty 22736 1727204259.82529: checking for any_errors_fatal 22736 1727204259.82531: done checking for any_errors_fatal 22736 1727204259.82532: checking for max_fail_percentage 22736 1727204259.82533: done checking for max_fail_percentage 22736 1727204259.82534: checking to see if all hosts have failed and the running result is not ok 22736 1727204259.82535: done checking to see if all hosts have failed 22736 1727204259.82536: getting the remaining hosts for this loop 22736 1727204259.82538: done getting the remaining hosts for this loop 22736 1727204259.82542: getting the next task for host managed-node2 22736 1727204259.82554: done getting next task for host managed-node2 22736 1727204259.82557: ^ task is: TASK: meta (flush_handlers) 22736 1727204259.82559: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204259.82563: getting variables 22736 1727204259.82565: in VariableManager get_vars() 22736 1727204259.82596: Calling all_inventory to load vars for managed-node2 22736 1727204259.82600: Calling groups_inventory to load vars for managed-node2 22736 1727204259.82604: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204259.82616: Calling all_plugins_play to load vars for managed-node2 22736 1727204259.82619: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204259.82623: Calling groups_plugins_play to load vars for managed-node2 22736 1727204259.85075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204259.87062: done with get_vars() 22736 1727204259.87099: done getting variables 22736 1727204259.87166: in VariableManager get_vars() 22736 1727204259.87183: Calling all_inventory to load vars for managed-node2 22736 1727204259.87186: Calling groups_inventory to load vars for managed-node2 22736 1727204259.87192: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204259.87197: Calling all_plugins_play to load vars for managed-node2 22736 1727204259.87199: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204259.87202: Calling groups_plugins_play to load vars for managed-node2 22736 1727204259.89183: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204259.91758: done with get_vars() 22736 1727204259.91797: done queuing things up, now waiting for results queue to drain 22736 1727204259.91799: results queue empty 22736 1727204259.91800: checking for any_errors_fatal 22736 1727204259.91804: done checking for any_errors_fatal 22736 1727204259.91804: checking for max_fail_percentage 22736 1727204259.91805: done checking for max_fail_percentage 22736 1727204259.91806: checking to see if all hosts have failed and the running result is not ok 22736 1727204259.91811: done checking to see if all hosts have failed 22736 1727204259.91811: getting the remaining hosts for this loop 22736 1727204259.91814: done getting the remaining hosts for this loop 22736 1727204259.91817: getting the next task for host managed-node2 22736 1727204259.91820: done getting next task for host managed-node2 22736 1727204259.91822: ^ task is: TASK: Show network_provider 22736 1727204259.91823: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204259.91826: getting variables 22736 1727204259.91827: in VariableManager get_vars() 22736 1727204259.91836: Calling all_inventory to load vars for managed-node2 22736 1727204259.91838: Calling groups_inventory to load vars for managed-node2 22736 1727204259.91840: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204259.91846: Calling all_plugins_play to load vars for managed-node2 22736 1727204259.91848: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204259.91850: Calling groups_plugins_play to load vars for managed-node2 22736 1727204259.93049: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204259.95401: done with get_vars() 22736 1727204259.95440: done getting variables 22736 1727204259.95507: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Show network_provider] *************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:53 Tuesday 24 September 2024 14:57:39 -0400 (0:00:01.107) 0:00:24.740 ***** 22736 1727204259.95549: entering _queue_task() for managed-node2/debug 22736 1727204259.96018: worker is 1 (out of 1 available) 22736 1727204259.96035: exiting _queue_task() for managed-node2/debug 22736 1727204259.96048: done queuing things up, now waiting for results queue to drain 22736 1727204259.96050: waiting for pending results... 22736 1727204259.96303: running TaskExecutor() for managed-node2/TASK: Show network_provider 22736 1727204259.96431: in run() - task 12b410aa-8751-4f4a-548a-000000000033 22736 1727204259.96436: variable 'ansible_search_path' from source: unknown 22736 1727204259.96479: calling self._execute() 22736 1727204259.96625: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204259.96629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204259.96633: variable 'omit' from source: magic vars 22736 1727204259.96984: variable 'ansible_distribution_major_version' from source: facts 22736 1727204259.96997: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204259.97004: variable 'omit' from source: magic vars 22736 1727204259.97033: variable 'omit' from source: magic vars 22736 1727204259.97068: variable 'omit' from source: magic vars 22736 1727204259.97108: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204259.97155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204259.97172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204259.97191: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204259.97206: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204259.97236: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204259.97239: variable 'ansible_host' from source: host vars for 
'managed-node2' 22736 1727204259.97242: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204259.97359: Set connection var ansible_timeout to 10 22736 1727204259.97370: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204259.97379: Set connection var ansible_shell_executable to /bin/sh 22736 1727204259.97382: Set connection var ansible_shell_type to sh 22736 1727204259.97396: Set connection var ansible_pipelining to False 22736 1727204259.97399: Set connection var ansible_connection to ssh 22736 1727204259.97424: variable 'ansible_shell_executable' from source: unknown 22736 1727204259.97427: variable 'ansible_connection' from source: unknown 22736 1727204259.97430: variable 'ansible_module_compression' from source: unknown 22736 1727204259.97433: variable 'ansible_shell_type' from source: unknown 22736 1727204259.97436: variable 'ansible_shell_executable' from source: unknown 22736 1727204259.97441: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204259.97450: variable 'ansible_pipelining' from source: unknown 22736 1727204259.97452: variable 'ansible_timeout' from source: unknown 22736 1727204259.97455: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204259.97656: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204259.97660: variable 'omit' from source: magic vars 22736 1727204259.97663: starting attempt loop 22736 1727204259.97665: running the handler 22736 1727204259.97740: variable 'network_provider' from source: set_fact 22736 1727204259.97788: variable 'network_provider' from source: set_fact 22736 1727204259.97801: handler run complete 22736 1727204259.97819: attempt loop complete, returning result 22736 1727204259.97823: _execute() done 22736 1727204259.97825: dumping result to json 22736 1727204259.97831: done dumping result, returning 22736 1727204259.97838: done running TaskExecutor() for managed-node2/TASK: Show network_provider [12b410aa-8751-4f4a-548a-000000000033] 22736 1727204259.97842: sending task result for task 12b410aa-8751-4f4a-548a-000000000033 22736 1727204259.97936: done sending task result for task 12b410aa-8751-4f4a-548a-000000000033 22736 1727204259.97939: WORKER PROCESS EXITING ok: [managed-node2] => { "network_provider": "nm" } 22736 1727204259.98004: no more pending results, returning what we have 22736 1727204259.98007: results queue empty 22736 1727204259.98009: checking for any_errors_fatal 22736 1727204259.98013: done checking for any_errors_fatal 22736 1727204259.98014: checking for max_fail_percentage 22736 1727204259.98016: done checking for max_fail_percentage 22736 1727204259.98017: checking to see if all hosts have failed and the running result is not ok 22736 1727204259.98018: done checking to see if all hosts have failed 22736 1727204259.98019: getting the remaining hosts for this loop 22736 1727204259.98021: done getting the remaining hosts for this loop 22736 1727204259.98025: getting the next task for host managed-node2 22736 1727204259.98033: done getting next task for host managed-node2 22736 1727204259.98035: ^ task is: TASK: meta (flush_handlers) 22736 1727204259.98038: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, 
fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204259.98043: getting variables 22736 1727204259.98045: in VariableManager get_vars() 22736 1727204259.98075: Calling all_inventory to load vars for managed-node2 22736 1727204259.98078: Calling groups_inventory to load vars for managed-node2 22736 1727204259.98082: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204259.98096: Calling all_plugins_play to load vars for managed-node2 22736 1727204259.98099: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204259.98103: Calling groups_plugins_play to load vars for managed-node2 22736 1727204259.99482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204260.01166: done with get_vars() 22736 1727204260.01196: done getting variables 22736 1727204260.01255: in VariableManager get_vars() 22736 1727204260.01264: Calling all_inventory to load vars for managed-node2 22736 1727204260.01266: Calling groups_inventory to load vars for managed-node2 22736 1727204260.01268: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204260.01274: Calling all_plugins_play to load vars for managed-node2 22736 1727204260.01277: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204260.01279: Calling groups_plugins_play to load vars for managed-node2 22736 1727204260.02642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204260.04501: done with get_vars() 22736 1727204260.04535: done queuing things up, now waiting for results queue to drain 22736 1727204260.04537: results queue empty 22736 1727204260.04538: checking for any_errors_fatal 22736 1727204260.04540: done checking for any_errors_fatal 22736 1727204260.04541: checking for max_fail_percentage 22736 1727204260.04542: done checking for max_fail_percentage 22736 1727204260.04543: checking to see if all hosts have failed and the running result is not ok 22736 1727204260.04543: done checking to see if all hosts have failed 22736 1727204260.04544: getting the remaining hosts for this loop 22736 1727204260.04545: done getting the remaining hosts for this loop 22736 1727204260.04547: getting the next task for host managed-node2 22736 1727204260.04556: done getting next task for host managed-node2 22736 1727204260.04557: ^ task is: TASK: meta (flush_handlers) 22736 1727204260.04558: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204260.04560: getting variables 22736 1727204260.04561: in VariableManager get_vars() 22736 1727204260.04569: Calling all_inventory to load vars for managed-node2 22736 1727204260.04570: Calling groups_inventory to load vars for managed-node2 22736 1727204260.04572: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204260.04578: Calling all_plugins_play to load vars for managed-node2 22736 1727204260.04580: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204260.04584: Calling groups_plugins_play to load vars for managed-node2 22736 1727204260.05917: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204260.07830: done with get_vars() 22736 1727204260.07858: done getting variables 22736 1727204260.07910: in VariableManager get_vars() 22736 1727204260.07919: Calling all_inventory to load vars for managed-node2 22736 1727204260.07921: Calling groups_inventory to load vars for managed-node2 22736 1727204260.07928: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204260.07936: Calling all_plugins_play to load vars for managed-node2 22736 1727204260.07940: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204260.07945: Calling groups_plugins_play to load vars for managed-node2 22736 1727204260.09101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204260.10823: done with get_vars() 22736 1727204260.10859: done queuing things up, now waiting for results queue to drain 22736 1727204260.10861: results queue empty 22736 1727204260.10862: checking for any_errors_fatal 22736 1727204260.10863: done checking for any_errors_fatal 22736 1727204260.10863: checking for max_fail_percentage 22736 1727204260.10864: done checking for max_fail_percentage 22736 1727204260.10865: checking to see if all hosts have failed and the running result is not ok 22736 1727204260.10866: done checking to see if all hosts have failed 22736 1727204260.10866: getting the remaining hosts for this loop 22736 1727204260.10867: done getting the remaining hosts for this loop 22736 1727204260.10869: getting the next task for host managed-node2 22736 1727204260.10872: done getting next task for host managed-node2 22736 1727204260.10873: ^ task is: None 22736 1727204260.10874: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204260.10875: done queuing things up, now waiting for results queue to drain 22736 1727204260.10876: results queue empty 22736 1727204260.10876: checking for any_errors_fatal 22736 1727204260.10877: done checking for any_errors_fatal 22736 1727204260.10877: checking for max_fail_percentage 22736 1727204260.10878: done checking for max_fail_percentage 22736 1727204260.10879: checking to see if all hosts have failed and the running result is not ok 22736 1727204260.10879: done checking to see if all hosts have failed 22736 1727204260.10880: getting the next task for host managed-node2 22736 1727204260.10882: done getting next task for host managed-node2 22736 1727204260.10883: ^ task is: None 22736 1727204260.10884: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204260.10921: in VariableManager get_vars() 22736 1727204260.10941: done with get_vars() 22736 1727204260.10946: in VariableManager get_vars() 22736 1727204260.10958: done with get_vars() 22736 1727204260.10962: variable 'omit' from source: magic vars 22736 1727204260.11066: variable 'profile' from source: play vars 22736 1727204260.11169: in VariableManager get_vars() 22736 1727204260.11181: done with get_vars() 22736 1727204260.11201: variable 'omit' from source: magic vars 22736 1727204260.11255: variable 'profile' from source: play vars PLAY [Set down {{ profile }}] ************************************************** 22736 1727204260.12154: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204260.12177: getting the remaining hosts for this loop 22736 1727204260.12178: done getting the remaining hosts for this loop 22736 1727204260.12180: getting the next task for host managed-node2 22736 1727204260.12183: done getting next task for host managed-node2 22736 1727204260.12185: ^ task is: TASK: Gathering Facts 22736 1727204260.12187: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204260.12190: getting variables 22736 1727204260.12191: in VariableManager get_vars() 22736 1727204260.12204: Calling all_inventory to load vars for managed-node2 22736 1727204260.12207: Calling groups_inventory to load vars for managed-node2 22736 1727204260.12209: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204260.12217: Calling all_plugins_play to load vars for managed-node2 22736 1727204260.12219: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204260.12221: Calling groups_plugins_play to load vars for managed-node2 22736 1727204260.18515: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204260.20074: done with get_vars() 22736 1727204260.20101: done getting variables 22736 1727204260.20143: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 Tuesday 24 September 2024 14:57:40 -0400 (0:00:00.246) 0:00:24.986 ***** 22736 1727204260.20164: entering _queue_task() for managed-node2/gather_facts 22736 1727204260.20441: worker is 1 (out of 1 available) 22736 1727204260.20456: exiting _queue_task() for managed-node2/gather_facts 22736 1727204260.20471: done queuing things up, now waiting for results queue to drain 22736 1727204260.20473: waiting for pending results... 
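[Editor's note, illustrative sketch] The entries that follow show the TaskExecutor for "Gathering Facts" opening the SSH connection and running two low-level probes before the setup module is transferred: first /bin/sh -c 'echo ~ && sleep 0' to resolve the remote home directory, then a umask/mkdir sequence that creates the per-task directory under ~/.ansible/tmp. As a rough, hedged illustration only (this is not Ansible's actual implementation), the same two probes can be reproduced by hand over SSH. The host address 10.31.9.159 and the command strings are taken from the log; the subprocess wrapper, the helper name run_remote, and the fixed temp-directory suffix are assumptions made for the sketch.

    import shlex
    import subprocess

    HOST = "root@10.31.9.159"  # address taken from the log; adjust for your own inventory

    def run_remote(command: str) -> str:
        # Wrap the command the same way the log shows it being issued:
        # /bin/sh -c '<command>' executed on the managed node over ssh.
        remote = "/bin/sh -c " + shlex.quote(command)
        result = subprocess.run(
            ["ssh", HOST, remote],
            capture_output=True, text=True, check=True,
        )
        return result.stdout.strip()

    # Probe 1: resolve the remote home directory (the 'echo ~ && sleep 0' step).
    home = run_remote("echo ~ && sleep 0")
    print("remote home:", home)

    # Probe 2: create a per-task temp directory under ~/.ansible/tmp.  Ansible
    # names the directory with a timestamp/pid/random suffix; a fixed suffix is
    # used here purely for illustration.
    tmpdir = f"{home}/.ansible/tmp/ansible-tmp-example"
    run_remote(f'( umask 77 && mkdir -p "{home}/.ansible/tmp" && mkdir "{tmpdir}" ) && sleep 0')
    print("remote temp dir:", tmpdir)

After these probes succeed, the log continues with the module transfer (sftp put of AnsiballZ_setup.py), the chmod of the temp directory, and the remote Python invocation.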
22736 1727204260.20743: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204260.20995: in run() - task 12b410aa-8751-4f4a-548a-00000000032b 22736 1727204260.21001: variable 'ansible_search_path' from source: unknown 22736 1727204260.21004: calling self._execute() 22736 1727204260.21007: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204260.21011: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204260.21018: variable 'omit' from source: magic vars 22736 1727204260.21460: variable 'ansible_distribution_major_version' from source: facts 22736 1727204260.21479: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204260.21494: variable 'omit' from source: magic vars 22736 1727204260.21541: variable 'omit' from source: magic vars 22736 1727204260.21594: variable 'omit' from source: magic vars 22736 1727204260.21646: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204260.21695: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204260.21725: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204260.21749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204260.21765: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204260.21808: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204260.21822: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204260.21832: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204260.21967: Set connection var ansible_timeout to 10 22736 1727204260.21988: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204260.22005: Set connection var ansible_shell_executable to /bin/sh 22736 1727204260.22011: Set connection var ansible_shell_type to sh 22736 1727204260.22024: Set connection var ansible_pipelining to False 22736 1727204260.22030: Set connection var ansible_connection to ssh 22736 1727204260.22056: variable 'ansible_shell_executable' from source: unknown 22736 1727204260.22064: variable 'ansible_connection' from source: unknown 22736 1727204260.22069: variable 'ansible_module_compression' from source: unknown 22736 1727204260.22076: variable 'ansible_shell_type' from source: unknown 22736 1727204260.22082: variable 'ansible_shell_executable' from source: unknown 22736 1727204260.22091: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204260.22100: variable 'ansible_pipelining' from source: unknown 22736 1727204260.22111: variable 'ansible_timeout' from source: unknown 22736 1727204260.22124: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204260.22333: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204260.22356: variable 'omit' from source: magic vars 22736 1727204260.22367: starting attempt loop 22736 1727204260.22375: running the 
handler 22736 1727204260.22400: variable 'ansible_facts' from source: unknown 22736 1727204260.22429: _low_level_execute_command(): starting 22736 1727204260.22442: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204260.23207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204260.23229: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204260.23244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204260.23262: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204260.23277: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204260.23288: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204260.23306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204260.23412: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204260.23448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204260.23526: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204260.25344: stdout chunk (state=3): >>>/root <<< 22736 1727204260.25554: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204260.25568: stdout chunk (state=3): >>><<< 22736 1727204260.25627: stderr chunk (state=3): >>><<< 22736 1727204260.25660: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204260.25692: _low_level_execute_command(): starting 22736 1727204260.25793: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo 
/root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253 `" && echo ansible-tmp-1727204260.2566767-23870-143522146365253="` echo /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253 `" ) && sleep 0' 22736 1727204260.26494: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204260.26512: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204260.26656: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204260.26703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204260.26724: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204260.26739: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204260.26820: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204260.28920: stdout chunk (state=3): >>>ansible-tmp-1727204260.2566767-23870-143522146365253=/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253 <<< 22736 1727204260.29084: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204260.29088: stdout chunk (state=3): >>><<< 22736 1727204260.29098: stderr chunk (state=3): >>><<< 22736 1727204260.29115: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204260.2566767-23870-143522146365253=/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204260.29154: variable 'ansible_module_compression' from source: 
unknown 22736 1727204260.29197: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204260.29287: variable 'ansible_facts' from source: unknown 22736 1727204260.29444: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py 22736 1727204260.29876: Sending initial data 22736 1727204260.29880: Sent initial data (154 bytes) 22736 1727204260.30482: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204260.30750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204260.30939: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204260.30982: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204260.32736: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22736 1727204260.32760: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204260.32823: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204260.32899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp66748m_x /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py <<< 22736 1727204260.32938: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py" debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp66748m_x" to remote "/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py" <<< 22736 1727204260.35044: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204260.35073: stderr chunk (state=3): >>><<< 22736 1727204260.35088: stdout chunk (state=3): >>><<< 22736 1727204260.35130: done transferring module to remote 22736 1727204260.35149: _low_level_execute_command(): starting 22736 1727204260.35161: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/ /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py && sleep 0' 22736 1727204260.35780: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204260.35799: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204260.35820: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204260.35840: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204260.35859: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204260.35872: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204260.35887: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204260.35917: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204260.35934: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204260.36005: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204260.36047: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204260.36065: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204260.36091: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204260.36162: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204260.38195: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204260.38202: stdout chunk (state=3): >>><<< 22736 1727204260.38216: stderr chunk (state=3): >>><<< 22736 1727204260.38249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204260.38260: _low_level_execute_command(): starting 22736 1727204260.38272: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/AnsiballZ_setup.py && sleep 0' 22736 1727204260.38958: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204260.38973: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204260.38993: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204260.39016: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204260.39035: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204260.39069: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204260.39179: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204260.39225: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204260.39274: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.09064: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, 
"ansible_loadavg": {"1m": 0.95263671875, "5m": 0.68212890625, "15m": 0.4140625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "40", "epoch": "1727204260", "epoch_int": "1727204260", "date": "2024-09-24", "time": "14:57:40", "iso8601_micro": "2024-09-24T18:57:40.713592Z", "iso8601": "2024-09-24T18:57:40Z", "iso8601_basic": "20240924T145740713592", "iso8601_basic_short": "20240924T145740", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline":<<< 22736 
1727204261.09101: stdout chunk (state=3): >>> {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": 
"2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146924032, "block_size": 4096, "block_total": 64479564, "block_available": 61315167, "block_used": 3164397, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::7aca:905f:e6:3e16", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204261.11320: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204261.11334: stdout chunk (state=3): >>><<< 22736 1727204261.11348: stderr chunk (state=3): >>><<< 22736 1727204261.11393: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_iscsi_iqn": "", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_loadavg": {"1m": 0.95263671875, "5m": 0.68212890625, "15m": 0.4140625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_is_chroot": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", 
"ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "40", "epoch": "1727204260", "epoch_int": "1727204260", "date": "2024-09-24", "time": "14:57:40", "iso8601_micro": "2024-09-24T18:57:40.713592Z", "iso8601": "2024-09-24T18:57:40Z", "iso8601_basic": "20240924T145740713592", "iso8601_basic_short": "20240924T145740", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_local": {}, "ansible_apparmor": {"status": "disabled"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_lsb": {}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2848, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 869, "free": 2848}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", 
"ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 764, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146924032, "block_size": 4096, "block_total": 64479564, "block_available": 61315167, "block_used": 3164397, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_service_mgr": "systemd", "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv4": {"address": "192.0.2.1", "broadcast": "192.0.2.255", "netmask": "255.255.255.0", "network": "192.0.2.0", "prefix": "24"}, "ipv6": [{"address": "fe80::7aca:905f:e6:3e16", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", 
"network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159", "192.0.2.1"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1", "192.0.2.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7aca:905f:e6:3e16"]}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204261.11996: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204261.12000: _low_level_execute_command(): starting 22736 1727204261.12003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204260.2566767-23870-143522146365253/ > /dev/null 2>&1 && sleep 0' 22736 1727204261.12841: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204261.12941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204261.12980: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204261.13009: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204261.13025: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204261.13106: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.15184: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204261.15208: stdout chunk (state=3): >>><<< 22736 1727204261.15222: stderr chunk (state=3): >>><<< 22736 1727204261.15396: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204261.15400: handler run complete 22736 1727204261.15471: variable 'ansible_facts' from source: unknown 22736 1727204261.15638: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.16178: variable 'ansible_facts' from source: unknown 22736 1727204261.16314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.16532: attempt loop complete, returning result 22736 1727204261.16543: _execute() done 22736 1727204261.16551: dumping result to json 22736 1727204261.16592: done dumping result, returning 22736 1727204261.16619: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-00000000032b] 22736 1727204261.16630: sending task result for task 12b410aa-8751-4f4a-548a-00000000032b 22736 1727204261.17433: done sending task result for task 12b410aa-8751-4f4a-548a-00000000032b 22736 1727204261.17436: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204261.17814: no more pending results, returning what we have 22736 1727204261.17818: results queue empty 22736 1727204261.17819: checking for any_errors_fatal 22736 1727204261.17820: done checking for any_errors_fatal 22736 1727204261.17821: checking for max_fail_percentage 22736 1727204261.17823: done checking for max_fail_percentage 22736 1727204261.17824: checking to see if all hosts have failed and the running result is not ok 22736 1727204261.17826: done checking to see if all hosts have failed 22736 1727204261.17826: getting the remaining hosts for this loop 22736 1727204261.17828: done getting the remaining hosts for this loop 22736 1727204261.17832: getting the next task for host managed-node2 22736 1727204261.17839: done getting next task for host managed-node2 22736 1727204261.17841: ^ task is: TASK: meta (flush_handlers) 22736 1727204261.17843: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204261.17848: getting variables 22736 1727204261.17850: in VariableManager get_vars() 22736 1727204261.17886: Calling all_inventory to load vars for managed-node2 22736 1727204261.17892: Calling groups_inventory to load vars for managed-node2 22736 1727204261.17895: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.17908: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.17921: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.17925: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.20372: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.23467: done with get_vars() 22736 1727204261.23520: done getting variables 22736 1727204261.23612: in VariableManager get_vars() 22736 1727204261.23630: Calling all_inventory to load vars for managed-node2 22736 1727204261.23633: Calling groups_inventory to load vars for managed-node2 22736 1727204261.23636: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.23642: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.23645: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.23649: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.25780: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.29011: done with get_vars() 22736 1727204261.29054: done queuing things up, now waiting for results queue to drain 22736 1727204261.29057: results queue empty 22736 1727204261.29058: checking for any_errors_fatal 22736 1727204261.29064: done checking for any_errors_fatal 22736 1727204261.29065: checking for max_fail_percentage 22736 1727204261.29067: done checking for max_fail_percentage 22736 1727204261.29068: checking to see if all hosts have failed and the running result is not ok 22736 1727204261.29069: done checking to see if all hosts have failed 22736 1727204261.29075: getting the remaining hosts for this loop 22736 1727204261.29085: done getting the remaining hosts for this loop 22736 1727204261.29090: getting the next task for host managed-node2 22736 1727204261.29096: done getting next task for host managed-node2 22736 1727204261.29100: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204261.29102: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204261.29114: getting variables 22736 1727204261.29115: in VariableManager get_vars() 22736 1727204261.29134: Calling all_inventory to load vars for managed-node2 22736 1727204261.29137: Calling groups_inventory to load vars for managed-node2 22736 1727204261.29139: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.29146: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.29149: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.29153: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.31260: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.34425: done with get_vars() 22736 1727204261.34481: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:41 -0400 (0:00:01.144) 0:00:26.130 ***** 22736 1727204261.34588: entering _queue_task() for managed-node2/include_tasks 22736 1727204261.35005: worker is 1 (out of 1 available) 22736 1727204261.35019: exiting _queue_task() for managed-node2/include_tasks 22736 1727204261.35035: done queuing things up, now waiting for results queue to drain 22736 1727204261.35037: waiting for pending results... 22736 1727204261.35357: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204261.35519: in run() - task 12b410aa-8751-4f4a-548a-00000000003c 22736 1727204261.35550: variable 'ansible_search_path' from source: unknown 22736 1727204261.35561: variable 'ansible_search_path' from source: unknown 22736 1727204261.35614: calling self._execute() 22736 1727204261.35737: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.35757: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.35771: variable 'omit' from source: magic vars 22736 1727204261.36496: variable 'ansible_distribution_major_version' from source: facts 22736 1727204261.36516: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204261.36531: _execute() done 22736 1727204261.36540: dumping result to json 22736 1727204261.36695: done dumping result, returning 22736 1727204261.36700: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-4f4a-548a-00000000003c] 22736 1727204261.36702: sending task result for task 12b410aa-8751-4f4a-548a-00000000003c 22736 1727204261.37103: no more pending results, returning what we have 22736 1727204261.37110: in VariableManager get_vars() 22736 1727204261.37169: Calling all_inventory to load vars for managed-node2 22736 1727204261.37173: Calling groups_inventory to load vars for managed-node2 22736 1727204261.37176: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.37196: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.37201: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.37206: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.37868: done sending task result for task 12b410aa-8751-4f4a-548a-00000000003c 22736 1727204261.37872: WORKER PROCESS EXITING 22736 1727204261.41152: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.44235: done with get_vars() 22736 1727204261.44272: variable 'ansible_search_path' from source: unknown 22736 1727204261.44274: variable 'ansible_search_path' from source: unknown 22736 1727204261.44320: we have included files to process 22736 1727204261.44321: generating all_blocks data 22736 1727204261.44323: done generating all_blocks data 22736 1727204261.44324: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204261.44326: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204261.44328: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204261.45131: done processing included file 22736 1727204261.45134: iterating over new_blocks loaded from include file 22736 1727204261.45136: in VariableManager get_vars() 22736 1727204261.45175: done with get_vars() 22736 1727204261.45178: filtering new block on tags 22736 1727204261.45202: done filtering new block on tags 22736 1727204261.45206: in VariableManager get_vars() 22736 1727204261.45234: done with get_vars() 22736 1727204261.45236: filtering new block on tags 22736 1727204261.45262: done filtering new block on tags 22736 1727204261.45271: in VariableManager get_vars() 22736 1727204261.45305: done with get_vars() 22736 1727204261.45307: filtering new block on tags 22736 1727204261.45329: done filtering new block on tags 22736 1727204261.45331: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22736 1727204261.45338: extending task lists for all hosts with included blocks 22736 1727204261.45947: done extending task lists 22736 1727204261.45949: done processing included files 22736 1727204261.45950: results queue empty 22736 1727204261.45951: checking for any_errors_fatal 22736 1727204261.45953: done checking for any_errors_fatal 22736 1727204261.45954: checking for max_fail_percentage 22736 1727204261.45955: done checking for max_fail_percentage 22736 1727204261.45956: checking to see if all hosts have failed and the running result is not ok 22736 1727204261.45957: done checking to see if all hosts have failed 22736 1727204261.45958: getting the remaining hosts for this loop 22736 1727204261.45960: done getting the remaining hosts for this loop 22736 1727204261.45963: getting the next task for host managed-node2 22736 1727204261.45968: done getting next task for host managed-node2 22736 1727204261.45971: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204261.45974: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204261.45984: getting variables 22736 1727204261.45986: in VariableManager get_vars() 22736 1727204261.46005: Calling all_inventory to load vars for managed-node2 22736 1727204261.46008: Calling groups_inventory to load vars for managed-node2 22736 1727204261.46011: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.46018: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.46021: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.46025: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.48246: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.51322: done with get_vars() 22736 1727204261.51361: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.168) 0:00:26.299 ***** 22736 1727204261.51471: entering _queue_task() for managed-node2/setup 22736 1727204261.51897: worker is 1 (out of 1 available) 22736 1727204261.51913: exiting _queue_task() for managed-node2/setup 22736 1727204261.51935: done queuing things up, now waiting for results queue to drain 22736 1727204261.51937: waiting for pending results... 22736 1727204261.52226: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204261.52377: in run() - task 12b410aa-8751-4f4a-548a-00000000036c 22736 1727204261.52485: variable 'ansible_search_path' from source: unknown 22736 1727204261.52490: variable 'ansible_search_path' from source: unknown 22736 1727204261.52495: calling self._execute() 22736 1727204261.52554: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.52563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.52575: variable 'omit' from source: magic vars 22736 1727204261.53052: variable 'ansible_distribution_major_version' from source: facts 22736 1727204261.53067: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204261.53368: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204261.56077: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204261.56185: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204261.56202: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204261.56248: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204261.56306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204261.56381: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204261.56433: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22736 1727204261.56498: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204261.56508: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204261.56526: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204261.56593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204261.56625: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204261.56655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204261.56743: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204261.56747: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204261.57299: variable '__network_required_facts' from source: role '' defaults 22736 1727204261.57416: variable 'ansible_facts' from source: unknown 22736 1727204261.58595: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22736 1727204261.58606: when evaluation is False, skipping this task 22736 1727204261.58623: _execute() done 22736 1727204261.58637: dumping result to json 22736 1727204261.58695: done dumping result, returning 22736 1727204261.58699: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-4f4a-548a-00000000036c] 22736 1727204261.58701: sending task result for task 12b410aa-8751-4f4a-548a-00000000036c skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204261.58905: no more pending results, returning what we have 22736 1727204261.58911: results queue empty 22736 1727204261.58912: checking for any_errors_fatal 22736 1727204261.58914: done checking for any_errors_fatal 22736 1727204261.58915: checking for max_fail_percentage 22736 1727204261.58917: done checking for max_fail_percentage 22736 1727204261.58918: checking to see if all hosts have failed and the running result is not ok 22736 1727204261.58919: done checking to see if all hosts have failed 22736 1727204261.58920: getting the remaining hosts for this loop 22736 1727204261.58922: done getting the remaining hosts for this loop 22736 1727204261.58927: getting the next task for host managed-node2 22736 1727204261.58939: done getting next task for host 
managed-node2 22736 1727204261.58944: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204261.58948: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204261.58968: getting variables 22736 1727204261.58970: in VariableManager get_vars() 22736 1727204261.59021: Calling all_inventory to load vars for managed-node2 22736 1727204261.59025: Calling groups_inventory to load vars for managed-node2 22736 1727204261.59028: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.59043: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.59047: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.59051: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.59611: done sending task result for task 12b410aa-8751-4f4a-548a-00000000036c 22736 1727204261.59615: WORKER PROCESS EXITING 22736 1727204261.61737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.64771: done with get_vars() 22736 1727204261.64823: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.134) 0:00:26.434 ***** 22736 1727204261.64950: entering _queue_task() for managed-node2/stat 22736 1727204261.65340: worker is 1 (out of 1 available) 22736 1727204261.65355: exiting _queue_task() for managed-node2/stat 22736 1727204261.65369: done queuing things up, now waiting for results queue to drain 22736 1727204261.65370: waiting for pending results... 
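
[Editor's note] For orientation, a minimal sketch of what the fact-gathering guard that was just skipped could look like as a task-file entry. This is not the role's verbatim source: only the module (setup), the no_log'd result, and the conditional that evaluated to False in the log above are taken from the output; the argument shown is an assumption.

# Hypothetical reconstruction of "Ensure ansible_facts used by role are present"
- name: Ensure ansible_facts used by role are present
  setup:
    gather_subset: min          # assumed; the log does not show the module arguments
  when: __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
  no_log: true                  # matches the "censored" skip result printed above
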
22736 1727204261.65622: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204261.65788: in run() - task 12b410aa-8751-4f4a-548a-00000000036e 22736 1727204261.65830: variable 'ansible_search_path' from source: unknown 22736 1727204261.65838: variable 'ansible_search_path' from source: unknown 22736 1727204261.65888: calling self._execute() 22736 1727204261.66013: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.66029: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.66059: variable 'omit' from source: magic vars 22736 1727204261.66544: variable 'ansible_distribution_major_version' from source: facts 22736 1727204261.66593: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204261.66797: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204261.67156: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204261.67243: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204261.67277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204261.67324: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204261.67461: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204261.67493: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204261.67569: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204261.67579: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204261.67694: variable '__network_is_ostree' from source: set_fact 22736 1727204261.67708: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204261.67716: when evaluation is False, skipping this task 22736 1727204261.67723: _execute() done 22736 1727204261.67730: dumping result to json 22736 1727204261.67738: done dumping result, returning 22736 1727204261.67750: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-4f4a-548a-00000000036e] 22736 1727204261.67786: sending task result for task 12b410aa-8751-4f4a-548a-00000000036e skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204261.67949: no more pending results, returning what we have 22736 1727204261.67954: results queue empty 22736 1727204261.67955: checking for any_errors_fatal 22736 1727204261.67962: done checking for any_errors_fatal 22736 1727204261.67963: checking for max_fail_percentage 22736 1727204261.67965: done checking for max_fail_percentage 22736 1727204261.67966: checking to see if all hosts have 
failed and the running result is not ok 22736 1727204261.67967: done checking to see if all hosts have failed 22736 1727204261.67968: getting the remaining hosts for this loop 22736 1727204261.67970: done getting the remaining hosts for this loop 22736 1727204261.67974: getting the next task for host managed-node2 22736 1727204261.67982: done getting next task for host managed-node2 22736 1727204261.67986: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204261.68091: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204261.68115: done sending task result for task 12b410aa-8751-4f4a-548a-00000000036e 22736 1727204261.68126: WORKER PROCESS EXITING 22736 1727204261.68133: getting variables 22736 1727204261.68135: in VariableManager get_vars() 22736 1727204261.68173: Calling all_inventory to load vars for managed-node2 22736 1727204261.68176: Calling groups_inventory to load vars for managed-node2 22736 1727204261.68178: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.68188: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.68193: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.68197: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.70477: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.73417: done with get_vars() 22736 1727204261.73463: done getting variables 22736 1727204261.73543: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.086) 0:00:26.520 ***** 22736 1727204261.73588: entering _queue_task() for managed-node2/set_fact 22736 1727204261.74193: worker is 1 (out of 1 available) 22736 1727204261.74205: exiting _queue_task() for managed-node2/set_fact 22736 1727204261.74219: done queuing things up, now waiting for results queue to drain 22736 1727204261.74221: waiting for pending results... 
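
[Editor's note] A sketch of the ostree check that was skipped above, assuming a conventional marker-file probe. Only the task name, the stat module, and the `not __network_is_ostree is defined` condition come from the log; the probed path and the register name are assumptions, not shown in this output.

# Hypothetical sketch of "Check if system is ostree" (set_facts.yml:12)
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed marker path; not visible in the log
  register: __ostree_booted_stat    # hypothetical register name
  when: not __network_is_ostree is defined
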
22736 1727204261.74321: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204261.74471: in run() - task 12b410aa-8751-4f4a-548a-00000000036f 22736 1727204261.74490: variable 'ansible_search_path' from source: unknown 22736 1727204261.74496: variable 'ansible_search_path' from source: unknown 22736 1727204261.74591: calling self._execute() 22736 1727204261.74699: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.74706: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.74709: variable 'omit' from source: magic vars 22736 1727204261.75107: variable 'ansible_distribution_major_version' from source: facts 22736 1727204261.75137: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204261.75357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204261.75682: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204261.75730: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204261.75792: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204261.75816: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204261.76019: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204261.76023: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204261.76026: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204261.76029: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204261.76146: variable '__network_is_ostree' from source: set_fact 22736 1727204261.76150: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204261.76153: when evaluation is False, skipping this task 22736 1727204261.76156: _execute() done 22736 1727204261.76158: dumping result to json 22736 1727204261.76164: done dumping result, returning 22736 1727204261.76237: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-4f4a-548a-00000000036f] 22736 1727204261.76240: sending task result for task 12b410aa-8751-4f4a-548a-00000000036f skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204261.76367: no more pending results, returning what we have 22736 1727204261.76371: results queue empty 22736 1727204261.76373: checking for any_errors_fatal 22736 1727204261.76380: done checking for any_errors_fatal 22736 1727204261.76382: checking for max_fail_percentage 22736 1727204261.76384: done checking for max_fail_percentage 22736 1727204261.76385: checking to see 
if all hosts have failed and the running result is not ok 22736 1727204261.76386: done checking to see if all hosts have failed 22736 1727204261.76387: getting the remaining hosts for this loop 22736 1727204261.76391: done getting the remaining hosts for this loop 22736 1727204261.76397: getting the next task for host managed-node2 22736 1727204261.76408: done getting next task for host managed-node2 22736 1727204261.76417: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204261.76421: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204261.76440: getting variables 22736 1727204261.76442: in VariableManager get_vars() 22736 1727204261.76488: Calling all_inventory to load vars for managed-node2 22736 1727204261.76727: Calling groups_inventory to load vars for managed-node2 22736 1727204261.76730: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204261.76742: Calling all_plugins_play to load vars for managed-node2 22736 1727204261.76747: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204261.76752: Calling groups_plugins_play to load vars for managed-node2 22736 1727204261.77504: done sending task result for task 12b410aa-8751-4f4a-548a-00000000036f 22736 1727204261.77509: WORKER PROCESS EXITING 22736 1727204261.79002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204261.82245: done with get_vars() 22736 1727204261.82283: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:41 -0400 (0:00:00.088) 0:00:26.609 ***** 22736 1727204261.82418: entering _queue_task() for managed-node2/service_facts 22736 1727204261.82841: worker is 1 (out of 1 available) 22736 1727204261.82856: exiting _queue_task() for managed-node2/service_facts 22736 1727204261.82870: done queuing things up, now waiting for results queue to drain 22736 1727204261.82872: waiting for pending results... 
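
[Editor's note] The set_fact task that was just skipped and the service_facts task that is about to run, sketched under the same caveats: task names, module names, and conditionals are taken from the log; the fact expression is an assumption, and ansible_facts.services is where the module's output (the large JSON payload later in this log) is placed.

# Hypothetical sketch of set_facts.yml:17 and set_facts.yml:21
- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumed expression
  when: not __network_is_ostree is defined

- name: Check which services are running
  service_facts:        # populates ansible_facts.services, as seen in the JSON result below

In the run above, every one of these tasks was additionally gated by the conditional `ansible_distribution_major_version != '6'`, which evaluated True each time.
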
22736 1727204261.83444: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204261.83522: in run() - task 12b410aa-8751-4f4a-548a-000000000371 22736 1727204261.83547: variable 'ansible_search_path' from source: unknown 22736 1727204261.83556: variable 'ansible_search_path' from source: unknown 22736 1727204261.83677: calling self._execute() 22736 1727204261.83726: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.83740: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.83756: variable 'omit' from source: magic vars 22736 1727204261.84232: variable 'ansible_distribution_major_version' from source: facts 22736 1727204261.84250: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204261.84261: variable 'omit' from source: magic vars 22736 1727204261.84358: variable 'omit' from source: magic vars 22736 1727204261.84415: variable 'omit' from source: magic vars 22736 1727204261.84545: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204261.84551: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204261.84670: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204261.84673: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204261.84676: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204261.84678: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204261.84681: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.84683: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.84825: Set connection var ansible_timeout to 10 22736 1727204261.84846: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204261.84863: Set connection var ansible_shell_executable to /bin/sh 22736 1727204261.84872: Set connection var ansible_shell_type to sh 22736 1727204261.84893: Set connection var ansible_pipelining to False 22736 1727204261.84907: Set connection var ansible_connection to ssh 22736 1727204261.84939: variable 'ansible_shell_executable' from source: unknown 22736 1727204261.84998: variable 'ansible_connection' from source: unknown 22736 1727204261.85002: variable 'ansible_module_compression' from source: unknown 22736 1727204261.85005: variable 'ansible_shell_type' from source: unknown 22736 1727204261.85007: variable 'ansible_shell_executable' from source: unknown 22736 1727204261.85014: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204261.85017: variable 'ansible_pipelining' from source: unknown 22736 1727204261.85019: variable 'ansible_timeout' from source: unknown 22736 1727204261.85021: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204261.85272: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204261.85296: variable 'omit' from source: magic vars 22736 
1727204261.85307: starting attempt loop 22736 1727204261.85328: running the handler 22736 1727204261.85435: _low_level_execute_command(): starting 22736 1727204261.85439: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204261.86229: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204261.86316: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204261.86343: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204261.86405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204261.86465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.88234: stdout chunk (state=3): >>>/root <<< 22736 1727204261.88609: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204261.88645: stderr chunk (state=3): >>><<< 22736 1727204261.88659: stdout chunk (state=3): >>><<< 22736 1727204261.88687: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204261.88716: _low_level_execute_command(): starting 22736 1727204261.88730: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387 `" && echo ansible-tmp-1727204261.8869896-23922-144872745191387="` echo /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387 `" ) && sleep 0' 22736 
1727204261.89525: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204261.89551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204261.89572: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204261.89655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.91744: stdout chunk (state=3): >>>ansible-tmp-1727204261.8869896-23922-144872745191387=/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387 <<< 22736 1727204261.91963: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204261.91966: stdout chunk (state=3): >>><<< 22736 1727204261.91969: stderr chunk (state=3): >>><<< 22736 1727204261.92195: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204261.8869896-23922-144872745191387=/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204261.92199: variable 'ansible_module_compression' from source: unknown 22736 1727204261.92202: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22736 1727204261.92204: variable 'ansible_facts' from source: unknown 22736 1727204261.92269: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py 22736 1727204261.92461: Sending initial data 22736 
1727204261.92556: Sent initial data (162 bytes) 22736 1727204261.93138: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204261.93150: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204261.93210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204261.93265: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204261.93278: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204261.93288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204261.93391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.95033: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204261.95062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204261.95141: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpw385id8z /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py <<< 22736 1727204261.95146: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py" <<< 22736 1727204261.95197: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpw385id8z" to remote "/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py" <<< 22736 1727204261.96448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204261.96452: stderr chunk (state=3): >>><<< 22736 1727204261.96473: stdout chunk (state=3): >>><<< 22736 1727204261.96611: done transferring module to remote 22736 1727204261.96618: _low_level_execute_command(): starting 22736 1727204261.96621: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/ /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py && sleep 0' 22736 1727204261.97248: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204261.97330: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204261.97333: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204261.97348: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204261.97428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204261.99556: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204261.99560: stdout chunk (state=3): >>><<< 22736 1727204261.99706: stderr chunk (state=3): >>><<< 22736 1727204261.99710: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204261.99719: _low_level_execute_command(): starting 22736 1727204261.99722: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/AnsiballZ_service_facts.py && sleep 0' 22736 1727204262.00297: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204262.00305: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204262.00462: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204262.00466: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204262.00515: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.03138: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": 
"modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": 
"systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": 
"systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, 
"dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": 
"modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": 
"systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22736 1727204264.04771: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204264.04881: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 22736 1727204264.04904: stderr chunk (state=3): >>><<< 22736 1727204264.04924: stdout chunk (state=3): >>><<< 22736 1727204264.04957: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": 
"sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": 
"udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": 
"inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", 
"source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204264.06245: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204264.06294: _low_level_execute_command(): starting 22736 1727204264.06297: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204261.8869896-23922-144872745191387/ > /dev/null 2>&1 && sleep 0' 22736 1727204264.06975: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204264.06995: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204264.07011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204264.07049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204264.07068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204264.07104: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204264.07126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204264.07225: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.07268: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.07338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.09432: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 22736 1727204264.09436: stdout chunk (state=3): >>><<< 22736 1727204264.09438: stderr chunk (state=3): >>><<< 22736 1727204264.09496: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204264.09499: handler run complete 22736 1727204264.09808: variable 'ansible_facts' from source: unknown 22736 1727204264.10065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204264.10896: variable 'ansible_facts' from source: unknown 22736 1727204264.11193: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204264.11517: attempt loop complete, returning result 22736 1727204264.11529: _execute() done 22736 1727204264.11536: dumping result to json 22736 1727204264.11626: done dumping result, returning 22736 1727204264.11640: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-4f4a-548a-000000000371] 22736 1727204264.11648: sending task result for task 12b410aa-8751-4f4a-548a-000000000371 22736 1727204264.12994: done sending task result for task 12b410aa-8751-4f4a-548a-000000000371 22736 1727204264.12998: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204264.13099: no more pending results, returning what we have 22736 1727204264.13102: results queue empty 22736 1727204264.13104: checking for any_errors_fatal 22736 1727204264.13108: done checking for any_errors_fatal 22736 1727204264.13109: checking for max_fail_percentage 22736 1727204264.13110: done checking for max_fail_percentage 22736 1727204264.13111: checking to see if all hosts have failed and the running result is not ok 22736 1727204264.13112: done checking to see if all hosts have failed 22736 1727204264.13115: getting the remaining hosts for this loop 22736 1727204264.13230: done getting the remaining hosts for this loop 22736 1727204264.13235: getting the next task for host managed-node2 22736 1727204264.13241: done getting next task for host managed-node2 22736 1727204264.13245: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22736 1727204264.13248: ^ state is: HOST STATE: block=2, task=4, 
rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204264.13260: getting variables 22736 1727204264.13261: in VariableManager get_vars() 22736 1727204264.13348: Calling all_inventory to load vars for managed-node2 22736 1727204264.13351: Calling groups_inventory to load vars for managed-node2 22736 1727204264.13354: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204264.13364: Calling all_plugins_play to load vars for managed-node2 22736 1727204264.13368: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204264.13371: Calling groups_plugins_play to load vars for managed-node2 22736 1727204264.16033: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204264.18881: done with get_vars() 22736 1727204264.18928: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:44 -0400 (0:00:02.366) 0:00:28.975 ***** 22736 1727204264.19047: entering _queue_task() for managed-node2/package_facts 22736 1727204264.19425: worker is 1 (out of 1 available) 22736 1727204264.19439: exiting _queue_task() for managed-node2/package_facts 22736 1727204264.19452: done queuing things up, now waiting for results queue to drain 22736 1727204264.19453: waiting for pending results... 
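For context, the two fact-gathering tasks traced in this part of the log ("Check which services are running" via the service_facts module, and "Check which packages are installed" via package_facts, queued from roles/network/tasks/set_facts.yml:26) would typically be written as shown below. This is a hedged reconstruction based only on the task names, module names, and the no_log censoring visible in the log, not the actual contents of set_facts.yml.

# Sketch of the fact-gathering tasks as suggested by this log (reconstruction, not the real role file)
- name: Check which services are running
  ansible.builtin.service_facts:   # takes no arguments; log shows module_args: {}
  no_log: true                     # matches the "output has been hidden" result above

- name: Check which packages are installed
  ansible.builtin.package_facts:
    manager: auto                  # assumption; default package manager autodetection
  no_log: true                     # assumed to mirror the service_facts task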
22736 1727204264.20020: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22736 1727204264.20025: in run() - task 12b410aa-8751-4f4a-548a-000000000372 22736 1727204264.20028: variable 'ansible_search_path' from source: unknown 22736 1727204264.20030: variable 'ansible_search_path' from source: unknown 22736 1727204264.20033: calling self._execute() 22736 1727204264.20071: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204264.20084: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204264.20103: variable 'omit' from source: magic vars 22736 1727204264.20563: variable 'ansible_distribution_major_version' from source: facts 22736 1727204264.20584: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204264.20601: variable 'omit' from source: magic vars 22736 1727204264.20682: variable 'omit' from source: magic vars 22736 1727204264.20735: variable 'omit' from source: magic vars 22736 1727204264.20791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204264.20841: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204264.20870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204264.21095: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204264.21098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204264.21101: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204264.21104: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204264.21106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204264.21109: Set connection var ansible_timeout to 10 22736 1727204264.21130: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204264.21147: Set connection var ansible_shell_executable to /bin/sh 22736 1727204264.21156: Set connection var ansible_shell_type to sh 22736 1727204264.21169: Set connection var ansible_pipelining to False 22736 1727204264.21176: Set connection var ansible_connection to ssh 22736 1727204264.21210: variable 'ansible_shell_executable' from source: unknown 22736 1727204264.21224: variable 'ansible_connection' from source: unknown 22736 1727204264.21234: variable 'ansible_module_compression' from source: unknown 22736 1727204264.21242: variable 'ansible_shell_type' from source: unknown 22736 1727204264.21250: variable 'ansible_shell_executable' from source: unknown 22736 1727204264.21258: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204264.21267: variable 'ansible_pipelining' from source: unknown 22736 1727204264.21275: variable 'ansible_timeout' from source: unknown 22736 1727204264.21284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204264.21540: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204264.21565: variable 'omit' from source: magic vars 22736 
1727204264.21575: starting attempt loop 22736 1727204264.21583: running the handler 22736 1727204264.21607: _low_level_execute_command(): starting 22736 1727204264.21620: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204264.22410: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204264.22494: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.22512: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.22595: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.24402: stdout chunk (state=3): >>>/root <<< 22736 1727204264.24615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204264.24619: stdout chunk (state=3): >>><<< 22736 1727204264.24622: stderr chunk (state=3): >>><<< 22736 1727204264.24765: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204264.24769: _low_level_execute_command(): starting 22736 1727204264.24773: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990 `" && echo ansible-tmp-1727204264.2465339-24057-129594159523990="` echo /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990 `" ) && sleep 0' 22736 1727204264.25379: stderr chunk (state=2): >>>OpenSSH_9.3p1, 
OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204264.25402: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204264.25421: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204264.25444: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204264.25569: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.25601: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.25683: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.27841: stdout chunk (state=3): >>>ansible-tmp-1727204264.2465339-24057-129594159523990=/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990 <<< 22736 1727204264.28059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204264.28063: stdout chunk (state=3): >>><<< 22736 1727204264.28066: stderr chunk (state=3): >>><<< 22736 1727204264.28085: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204264.2465339-24057-129594159523990=/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204264.28162: variable 'ansible_module_compression' from source: unknown 22736 1727204264.28230: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22736 1727204264.28399: variable 'ansible_facts' from source: unknown 22736 1727204264.28532: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py 22736 1727204264.28727: Sending initial data 22736 1727204264.28738: Sent initial data (162 bytes) 22736 1727204264.29506: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204264.29602: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.29647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.29719: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.31473: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204264.31538: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204264.31576: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp_295v8um /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py <<< 22736 1727204264.31579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py" <<< 22736 1727204264.31644: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp_295v8um" to remote "/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py" <<< 22736 1727204264.35384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204264.35388: stdout chunk (state=3): >>><<< 22736 1727204264.35396: stderr chunk (state=3): >>><<< 22736 1727204264.35398: done transferring module to remote 22736 1727204264.35401: _low_level_execute_command(): starting 22736 1727204264.35403: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/ /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py && sleep 0' 22736 1727204264.36552: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204264.36608: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204264.36818: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.36891: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.37137: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204264.39063: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204264.39151: stderr chunk (state=3): >>><<< 22736 1727204264.39168: stdout chunk (state=3): >>><<< 22736 1727204264.39192: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204264.39208: _low_level_execute_command(): starting 22736 1727204264.39300: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/AnsiballZ_package_facts.py && sleep 0' 22736 1727204264.39859: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204264.39971: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204264.39986: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204264.40072: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204264.40112: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204264.40133: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204264.40150: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204264.40263: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204265.05387: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": 
"1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 22736 1727204265.05471: stdout chunk (state=3): >>>pm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": 
"file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release"<<< 22736 1727204265.05543: stdout chunk (state=3): >>>: "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", 
"version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 22736 1727204265.05551: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": 
[{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 22736 1727204265.05582: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name"<<< 22736 1727204265.05629: stdout chunk (state=3): >>>: "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": 
"guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": 
"18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_<<< 22736 1727204265.05648: stdout chunk (state=3): >>>64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", 
"release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": 
"1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22736 1727204265.07698: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204265.07702: stdout chunk (state=3): >>><<< 22736 1727204265.07704: stderr chunk (state=3): >>><<< 22736 1727204265.07807: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204265.12295: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204265.12301: _low_level_execute_command(): starting 22736 1727204265.12304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204264.2465339-24057-129594159523990/ > /dev/null 2>&1 && sleep 0' 22736 1727204265.12894: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204265.12915: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204265.12918: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204265.12941: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204265.12949: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204265.12957: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204265.12969: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204265.12984: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204265.12998: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204265.13007: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22736 1727204265.13025: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204265.13037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204265.13055: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204265.13059: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204265.13067: stderr chunk (state=3): >>>debug2: match found <<< 22736 1727204265.13078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204265.13158: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204265.13169: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204265.13231: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 
<<< 22736 1727204265.13282: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204265.15499: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204265.15502: stdout chunk (state=3): >>><<< 22736 1727204265.15505: stderr chunk (state=3): >>><<< 22736 1727204265.15508: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204265.15510: handler run complete 22736 1727204265.17529: variable 'ansible_facts' from source: unknown 22736 1727204265.18455: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.24254: variable 'ansible_facts' from source: unknown 22736 1727204265.25430: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.27245: attempt loop complete, returning result 22736 1727204265.27275: _execute() done 22736 1727204265.27285: dumping result to json 22736 1727204265.27891: done dumping result, returning 22736 1727204265.27897: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-4f4a-548a-000000000372] 22736 1727204265.27900: sending task result for task 12b410aa-8751-4f4a-548a-000000000372 22736 1727204265.31975: done sending task result for task 12b410aa-8751-4f4a-548a-000000000372 22736 1727204265.31980: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204265.32162: no more pending results, returning what we have 22736 1727204265.32166: results queue empty 22736 1727204265.32167: checking for any_errors_fatal 22736 1727204265.32174: done checking for any_errors_fatal 22736 1727204265.32175: checking for max_fail_percentage 22736 1727204265.32177: done checking for max_fail_percentage 22736 1727204265.32178: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.32179: done checking to see if all hosts have failed 22736 1727204265.32180: getting the remaining hosts for this loop 22736 1727204265.32182: done getting the remaining hosts for this loop 22736 1727204265.32185: getting the next task for host managed-node2 22736 1727204265.32309: done getting next task for host managed-node2 
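The large JSON blob above is the stdout of the package_facts module: a dictionary keyed by package name, where each value is a list of {name, version, release, epoch, arch, source} entries, and the invocation recorded in the log uses manager: ["auto"] with strategy: "first". Because the role runs this step with no_log: true, the callback only prints the "censored" placeholder seen here, but the data is still registered under ansible_facts.packages. The following is a minimal stand-alone sketch (not the role's actual task file; play and package names are placeholders) of gathering and querying the same facts:

    - hosts: managed-node2
      gather_facts: false
      tasks:
        - name: Check which packages are installed
          ansible.builtin.package_facts:
            manager: auto

        - name: Show the installed NetworkManager entry, if any
          ansible.builtin.debug:
            msg: "{{ ansible_facts.packages['NetworkManager'] | default('not installed') }}"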
22736 1727204265.32315: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204265.32317: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204265.32330: getting variables 22736 1727204265.32332: in VariableManager get_vars() 22736 1727204265.32368: Calling all_inventory to load vars for managed-node2 22736 1727204265.32371: Calling groups_inventory to load vars for managed-node2 22736 1727204265.32374: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.32384: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.32387: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.32395: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.34563: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.39434: done with get_vars() 22736 1727204265.39494: done getting variables 22736 1727204265.39574: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:45 -0400 (0:00:01.205) 0:00:30.181 ***** 22736 1727204265.39622: entering _queue_task() for managed-node2/debug 22736 1727204265.40002: worker is 1 (out of 1 available) 22736 1727204265.40018: exiting _queue_task() for managed-node2/debug 22736 1727204265.40032: done queuing things up, now waiting for results queue to drain 22736 1727204265.40033: waiting for pending results... 
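The task being queued here lives at roles/network/tasks/main.yml:7. Its body is not shown in the log, but the message it prints below ("Using network provider: nm") and the network_provider variable it reads (set earlier via set_fact) suggest a debug task roughly like this sketch:

    # Sketch reconstructed from the logged output; wording of the msg is taken
    # from the result printed further down.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"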
22736 1727204265.40297: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204265.40440: in run() - task 12b410aa-8751-4f4a-548a-00000000003d 22736 1727204265.40465: variable 'ansible_search_path' from source: unknown 22736 1727204265.40473: variable 'ansible_search_path' from source: unknown 22736 1727204265.40540: calling self._execute() 22736 1727204265.40672: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.40688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.40720: variable 'omit' from source: magic vars 22736 1727204265.41235: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.41256: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204265.41282: variable 'omit' from source: magic vars 22736 1727204265.41381: variable 'omit' from source: magic vars 22736 1727204265.41499: variable 'network_provider' from source: set_fact 22736 1727204265.41531: variable 'omit' from source: magic vars 22736 1727204265.41585: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204265.41716: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204265.41720: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204265.41723: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204265.41735: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204265.41775: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204265.41785: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.41800: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.41962: Set connection var ansible_timeout to 10 22736 1727204265.41985: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204265.42004: Set connection var ansible_shell_executable to /bin/sh 22736 1727204265.42016: Set connection var ansible_shell_type to sh 22736 1727204265.42040: Set connection var ansible_pipelining to False 22736 1727204265.42148: Set connection var ansible_connection to ssh 22736 1727204265.42151: variable 'ansible_shell_executable' from source: unknown 22736 1727204265.42154: variable 'ansible_connection' from source: unknown 22736 1727204265.42156: variable 'ansible_module_compression' from source: unknown 22736 1727204265.42158: variable 'ansible_shell_type' from source: unknown 22736 1727204265.42160: variable 'ansible_shell_executable' from source: unknown 22736 1727204265.42162: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.42164: variable 'ansible_pipelining' from source: unknown 22736 1727204265.42166: variable 'ansible_timeout' from source: unknown 22736 1727204265.42168: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.42304: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 
(found_in_cache=True, class_only=False) 22736 1727204265.42325: variable 'omit' from source: magic vars 22736 1727204265.42334: starting attempt loop 22736 1727204265.42341: running the handler 22736 1727204265.42407: handler run complete 22736 1727204265.42434: attempt loop complete, returning result 22736 1727204265.42441: _execute() done 22736 1727204265.42448: dumping result to json 22736 1727204265.42455: done dumping result, returning 22736 1727204265.42474: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-4f4a-548a-00000000003d] 22736 1727204265.42495: sending task result for task 12b410aa-8751-4f4a-548a-00000000003d ok: [managed-node2] => {} MSG: Using network provider: nm 22736 1727204265.42777: no more pending results, returning what we have 22736 1727204265.42782: results queue empty 22736 1727204265.42783: checking for any_errors_fatal 22736 1727204265.42996: done checking for any_errors_fatal 22736 1727204265.42998: checking for max_fail_percentage 22736 1727204265.43000: done checking for max_fail_percentage 22736 1727204265.43001: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.43002: done checking to see if all hosts have failed 22736 1727204265.43003: getting the remaining hosts for this loop 22736 1727204265.43005: done getting the remaining hosts for this loop 22736 1727204265.43009: getting the next task for host managed-node2 22736 1727204265.43018: done getting next task for host managed-node2 22736 1727204265.43022: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22736 1727204265.43025: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204265.43038: getting variables 22736 1727204265.43040: in VariableManager get_vars() 22736 1727204265.43079: Calling all_inventory to load vars for managed-node2 22736 1727204265.43082: Calling groups_inventory to load vars for managed-node2 22736 1727204265.43085: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.43104: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.43108: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.43117: done sending task result for task 12b410aa-8751-4f4a-548a-00000000003d 22736 1727204265.43121: WORKER PROCESS EXITING 22736 1727204265.43126: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.45704: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.48670: done with get_vars() 22736 1727204265.48709: done getting variables 22736 1727204265.48783: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.091) 0:00:30.273 ***** 22736 1727204265.48822: entering _queue_task() for managed-node2/fail 22736 1727204265.49177: worker is 1 (out of 1 available) 22736 1727204265.49296: exiting _queue_task() for managed-node2/fail 22736 1727204265.49309: done queuing things up, now waiting for results queue to drain 22736 1727204265.49310: waiting for pending results... 
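The task at main.yml:11 is a fail action that is skipped below because network_state != {} evaluates to False. A sketch of what such a guard typically looks like; only the network_state condition is confirmed by this log, while the initscripts check and the message text are assumptions:

    - name: >-
        Abort applying the network state configuration if using the
        `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: Applying network_state is not supported with the initscripts provider
      when:
        - network_state != {}
        - network_provider == "initscripts"   # assumed second condition, not evaluated in this run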
22736 1727204265.49519: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22736 1727204265.49658: in run() - task 12b410aa-8751-4f4a-548a-00000000003e 22736 1727204265.49681: variable 'ansible_search_path' from source: unknown 22736 1727204265.49696: variable 'ansible_search_path' from source: unknown 22736 1727204265.49766: calling self._execute() 22736 1727204265.49883: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.49900: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.49921: variable 'omit' from source: magic vars 22736 1727204265.50373: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.50395: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204265.50564: variable 'network_state' from source: role '' defaults 22736 1727204265.50583: Evaluated conditional (network_state != {}): False 22736 1727204265.50594: when evaluation is False, skipping this task 22736 1727204265.50602: _execute() done 22736 1727204265.50609: dumping result to json 22736 1727204265.50616: done dumping result, returning 22736 1727204265.50630: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-4f4a-548a-00000000003e] 22736 1727204265.50639: sending task result for task 12b410aa-8751-4f4a-548a-00000000003e skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204265.50846: no more pending results, returning what we have 22736 1727204265.50851: results queue empty 22736 1727204265.50853: checking for any_errors_fatal 22736 1727204265.50861: done checking for any_errors_fatal 22736 1727204265.50862: checking for max_fail_percentage 22736 1727204265.50864: done checking for max_fail_percentage 22736 1727204265.50865: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.50866: done checking to see if all hosts have failed 22736 1727204265.50868: getting the remaining hosts for this loop 22736 1727204265.50870: done getting the remaining hosts for this loop 22736 1727204265.50875: getting the next task for host managed-node2 22736 1727204265.50882: done getting next task for host managed-node2 22736 1727204265.50886: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204265.50892: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204265.50910: getting variables 22736 1727204265.50912: in VariableManager get_vars() 22736 1727204265.50954: Calling all_inventory to load vars for managed-node2 22736 1727204265.50957: Calling groups_inventory to load vars for managed-node2 22736 1727204265.50960: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.50975: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.50979: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.50983: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.51199: done sending task result for task 12b410aa-8751-4f4a-548a-00000000003e 22736 1727204265.51203: WORKER PROCESS EXITING 22736 1727204265.53371: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.56629: done with get_vars() 22736 1727204265.56679: done getting variables 22736 1727204265.56762: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.079) 0:00:30.352 ***** 22736 1727204265.56816: entering _queue_task() for managed-node2/fail 22736 1727204265.57421: worker is 1 (out of 1 available) 22736 1727204265.57433: exiting _queue_task() for managed-node2/fail 22736 1727204265.57445: done queuing things up, now waiting for results queue to drain 22736 1727204265.57447: waiting for pending results... 
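The task at main.yml:18 is another fail guard and is skipped for the same reason: network_state is still the role default {}. A hedged sketch; the version comparison is inferred from the task name and is never reached in this run:

    - name: >-
        Abort applying the network state configuration if the system version
        of the managed host is below 8
      ansible.builtin.fail:
        msg: Applying network_state requires a managed host running EL8 or later
      when:
        - network_state != {}
        - ansible_distribution_major_version | int < 8   # inferred from the task name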
22736 1727204265.57610: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204265.57756: in run() - task 12b410aa-8751-4f4a-548a-00000000003f 22736 1727204265.57781: variable 'ansible_search_path' from source: unknown 22736 1727204265.57796: variable 'ansible_search_path' from source: unknown 22736 1727204265.57845: calling self._execute() 22736 1727204265.57971: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.57987: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.58021: variable 'omit' from source: magic vars 22736 1727204265.58702: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.58726: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204265.58994: variable 'network_state' from source: role '' defaults 22736 1727204265.59022: Evaluated conditional (network_state != {}): False 22736 1727204265.59031: when evaluation is False, skipping this task 22736 1727204265.59040: _execute() done 22736 1727204265.59048: dumping result to json 22736 1727204265.59057: done dumping result, returning 22736 1727204265.59069: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-4f4a-548a-00000000003f] 22736 1727204265.59080: sending task result for task 12b410aa-8751-4f4a-548a-00000000003f 22736 1727204265.59296: done sending task result for task 12b410aa-8751-4f4a-548a-00000000003f 22736 1727204265.59300: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204265.59363: no more pending results, returning what we have 22736 1727204265.59368: results queue empty 22736 1727204265.59369: checking for any_errors_fatal 22736 1727204265.59378: done checking for any_errors_fatal 22736 1727204265.59379: checking for max_fail_percentage 22736 1727204265.59381: done checking for max_fail_percentage 22736 1727204265.59383: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.59384: done checking to see if all hosts have failed 22736 1727204265.59385: getting the remaining hosts for this loop 22736 1727204265.59387: done getting the remaining hosts for this loop 22736 1727204265.59393: getting the next task for host managed-node2 22736 1727204265.59404: done getting next task for host managed-node2 22736 1727204265.59408: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204265.59411: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204265.59430: getting variables 22736 1727204265.59432: in VariableManager get_vars() 22736 1727204265.59471: Calling all_inventory to load vars for managed-node2 22736 1727204265.59474: Calling groups_inventory to load vars for managed-node2 22736 1727204265.59477: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.59600: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.59605: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.59610: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.71192: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.75387: done with get_vars() 22736 1727204265.75445: done getting variables 22736 1727204265.75515: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.187) 0:00:30.540 ***** 22736 1727204265.75549: entering _queue_task() for managed-node2/fail 22736 1727204265.75945: worker is 1 (out of 1 available) 22736 1727204265.75959: exiting _queue_task() for managed-node2/fail 22736 1727204265.75973: done queuing things up, now waiting for results queue to drain 22736 1727204265.75975: waiting for pending results... 
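The task at main.yml:25 is skipped further down because ansible_distribution is not in __network_rh_distros, even though the major-version check (> 9) evaluated True. A sketch reconstructed from the two conditions the log evaluates; the failure message is an assumption:

    - name: >-
        Abort applying teaming configuration if the system version of the
        managed host is EL10 or later
      ansible.builtin.fail:
        msg: Team interfaces are not supported on EL10 or later
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros
        # the real task presumably also checks whether any team connections are
        # requested; that condition is not reached in this run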
22736 1727204265.76324: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204265.76596: in run() - task 12b410aa-8751-4f4a-548a-000000000040 22736 1727204265.76600: variable 'ansible_search_path' from source: unknown 22736 1727204265.76603: variable 'ansible_search_path' from source: unknown 22736 1727204265.76606: calling self._execute() 22736 1727204265.76655: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.76670: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.76692: variable 'omit' from source: magic vars 22736 1727204265.77160: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.77179: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204265.77420: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204265.80077: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204265.80183: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204265.80240: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204265.80283: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204265.80326: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204265.80432: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204265.80475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204265.80596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.80600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204265.80603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204265.80751: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.80776: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22736 1727204265.80952: variable 'ansible_distribution' from source: facts 22736 1727204265.80970: variable '__network_rh_distros' from source: role '' defaults 22736 1727204265.80986: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22736 1727204265.80998: when evaluation is False, skipping this task 22736 1727204265.81006: _execute() done 22736 1727204265.81016: dumping result to json 22736 1727204265.81026: done dumping result, returning 22736 1727204265.81038: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-4f4a-548a-000000000040] 22736 1727204265.81048: sending task result for task 12b410aa-8751-4f4a-548a-000000000040 22736 1727204265.81260: done sending task result for task 12b410aa-8751-4f4a-548a-000000000040 22736 1727204265.81264: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22736 1727204265.81327: no more pending results, returning what we have 22736 1727204265.81331: results queue empty 22736 1727204265.81332: checking for any_errors_fatal 22736 1727204265.81343: done checking for any_errors_fatal 22736 1727204265.81344: checking for max_fail_percentage 22736 1727204265.81346: done checking for max_fail_percentage 22736 1727204265.81347: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.81349: done checking to see if all hosts have failed 22736 1727204265.81350: getting the remaining hosts for this loop 22736 1727204265.81352: done getting the remaining hosts for this loop 22736 1727204265.81358: getting the next task for host managed-node2 22736 1727204265.81366: done getting next task for host managed-node2 22736 1727204265.81371: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204265.81373: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204265.81392: getting variables 22736 1727204265.81395: in VariableManager get_vars() 22736 1727204265.81445: Calling all_inventory to load vars for managed-node2 22736 1727204265.81448: Calling groups_inventory to load vars for managed-node2 22736 1727204265.81451: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.81465: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.81469: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.81473: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.83971: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204265.86870: done with get_vars() 22736 1727204265.86915: done getting variables 22736 1727204265.86981: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:45 -0400 (0:00:00.114) 0:00:30.655 ***** 22736 1727204265.87022: entering _queue_task() for managed-node2/dnf 22736 1727204265.87423: worker is 1 (out of 1 available) 22736 1727204265.87438: exiting _queue_task() for managed-node2/dnf 22736 1727204265.87452: done queuing things up, now waiting for results queue to drain 22736 1727204265.87454: waiting for pending results... 
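The task at main.yml:36 is a dnf action that ends up skipped because neither __network_wireless_connections_defined nor __network_team_connections_defined holds for this profile. A sketch of a check-mode update probe consistent with the conditions the log evaluates; the network_packages list is assumed from the role defaults loaded later:

    - name: >-
        Check if updates for network packages are available through the DNF
        package manager due to wireless or team interfaces
      ansible.builtin.dnf:
        name: "{{ network_packages }}"   # assumed; the package list is not shown here
        state: latest
      check_mode: true                   # probe only, never installs anything
      when:
        - ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7
        - __network_wireless_connections_defined or __network_team_connections_defined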
22736 1727204265.87793: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204265.88096: in run() - task 12b410aa-8751-4f4a-548a-000000000041 22736 1727204265.88100: variable 'ansible_search_path' from source: unknown 22736 1727204265.88103: variable 'ansible_search_path' from source: unknown 22736 1727204265.88107: calling self._execute() 22736 1727204265.88110: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204265.88115: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204265.88119: variable 'omit' from source: magic vars 22736 1727204265.88574: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.88588: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204265.88871: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204265.92011: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204265.92107: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204265.92155: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204265.92197: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204265.92228: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204265.92331: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204265.92374: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204265.92567: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.92571: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204265.92577: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204265.92848: variable 'ansible_distribution' from source: facts 22736 1727204265.92855: variable 'ansible_distribution_major_version' from source: facts 22736 1727204265.92864: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22736 1727204265.93043: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204265.93196: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204265.93229: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204265.93261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.93312: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204265.93330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204265.93386: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204265.93419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204265.93456: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.93587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204265.93593: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204265.93596: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204265.93607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204265.93637: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.93692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204265.93709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204265.93918: variable 'network_connections' from source: play vars 22736 1727204265.93933: variable 'profile' from source: play vars 22736 1727204265.94024: variable 'profile' from source: play vars 22736 1727204265.94028: variable 'interface' from source: set_fact 22736 1727204265.94102: variable 'interface' from source: set_fact 22736 1727204265.94201: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 22736 1727204265.94395: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204265.94651: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204265.94655: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204265.94658: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204265.94661: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204265.94663: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204265.94673: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204265.94698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204265.94756: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204265.95103: variable 'network_connections' from source: play vars 22736 1727204265.95109: variable 'profile' from source: play vars 22736 1727204265.95186: variable 'profile' from source: play vars 22736 1727204265.95192: variable 'interface' from source: set_fact 22736 1727204265.95268: variable 'interface' from source: set_fact 22736 1727204265.95299: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204265.95304: when evaluation is False, skipping this task 22736 1727204265.95420: _execute() done 22736 1727204265.95424: dumping result to json 22736 1727204265.95427: done dumping result, returning 22736 1727204265.95429: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000041] 22736 1727204265.95431: sending task result for task 12b410aa-8751-4f4a-548a-000000000041 22736 1727204265.95500: done sending task result for task 12b410aa-8751-4f4a-548a-000000000041 22736 1727204265.95503: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204265.95578: no more pending results, returning what we have 22736 1727204265.95582: results queue empty 22736 1727204265.95583: checking for any_errors_fatal 22736 1727204265.95590: done checking for any_errors_fatal 22736 1727204265.95591: checking for max_fail_percentage 22736 1727204265.95593: done checking for max_fail_percentage 22736 1727204265.95594: checking to see if all hosts have failed and the running result is not ok 22736 1727204265.95595: done checking to see if all hosts have failed 22736 1727204265.95596: getting the remaining hosts for this loop 22736 1727204265.95598: done getting the remaining hosts for this loop 22736 
1727204265.95602: getting the next task for host managed-node2 22736 1727204265.95608: done getting next task for host managed-node2 22736 1727204265.95619: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204265.95621: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204265.95637: getting variables 22736 1727204265.95639: in VariableManager get_vars() 22736 1727204265.95678: Calling all_inventory to load vars for managed-node2 22736 1727204265.95681: Calling groups_inventory to load vars for managed-node2 22736 1727204265.95683: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204265.95756: Calling all_plugins_play to load vars for managed-node2 22736 1727204265.95762: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204265.95767: Calling groups_plugins_play to load vars for managed-node2 22736 1727204265.98155: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.01080: done with get_vars() 22736 1727204266.01120: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204266.01213: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.142) 0:00:30.797 ***** 22736 1727204266.01258: entering _queue_task() for managed-node2/yum 22736 1727204266.01657: worker is 1 (out of 1 available) 22736 1727204266.01673: exiting _queue_task() for managed-node2/yum 22736 1727204266.01691: done queuing things up, now waiting for results queue to drain 22736 1727204266.01695: waiting for pending results... 
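The task at main.yml:48 is the YUM counterpart of the previous check; note that ansible-core redirects ansible.builtin.yum to ansible.builtin.dnf on this controller, and the task is skipped below because ansible_distribution_major_version | int < 8 is False. A sketch under the same assumptions as above:

    - name: >-
        Check if updates for network packages are available through the YUM
        package manager due to wireless or team interfaces
      ansible.builtin.yum:
        name: "{{ network_packages }}"   # assumed package list
        state: latest
      check_mode: true
      when:
        - ansible_distribution_major_version | int < 8   # the condition reported as false below
        - __network_wireless_connections_defined or __network_team_connections_defined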
22736 1727204266.02011: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204266.02081: in run() - task 12b410aa-8751-4f4a-548a-000000000042 22736 1727204266.02195: variable 'ansible_search_path' from source: unknown 22736 1727204266.02200: variable 'ansible_search_path' from source: unknown 22736 1727204266.02203: calling self._execute() 22736 1727204266.02284: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.02303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.02329: variable 'omit' from source: magic vars 22736 1727204266.02808: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.02833: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.03082: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204266.05852: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204266.05995: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204266.06009: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204266.06115: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204266.06119: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204266.06201: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.06251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.06288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.06354: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.06378: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.06494: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.06522: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22736 1727204266.06546: when evaluation is False, skipping this task 22736 1727204266.06550: _execute() done 22736 1727204266.06552: dumping result to json 22736 1727204266.06791: done dumping result, returning 22736 1727204266.06795: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000042] 22736 
1727204266.06799: sending task result for task 12b410aa-8751-4f4a-548a-000000000042 22736 1727204266.06890: done sending task result for task 12b410aa-8751-4f4a-548a-000000000042 22736 1727204266.06894: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22736 1727204266.06956: no more pending results, returning what we have 22736 1727204266.06960: results queue empty 22736 1727204266.06961: checking for any_errors_fatal 22736 1727204266.06969: done checking for any_errors_fatal 22736 1727204266.06970: checking for max_fail_percentage 22736 1727204266.06972: done checking for max_fail_percentage 22736 1727204266.06973: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.06975: done checking to see if all hosts have failed 22736 1727204266.06975: getting the remaining hosts for this loop 22736 1727204266.06978: done getting the remaining hosts for this loop 22736 1727204266.06983: getting the next task for host managed-node2 22736 1727204266.06992: done getting next task for host managed-node2 22736 1727204266.06997: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204266.07000: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204266.07015: getting variables 22736 1727204266.07018: in VariableManager get_vars() 22736 1727204266.07064: Calling all_inventory to load vars for managed-node2 22736 1727204266.07068: Calling groups_inventory to load vars for managed-node2 22736 1727204266.07071: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.07086: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.07095: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.07100: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.09773: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.12796: done with get_vars() 22736 1727204266.12833: done getting variables 22736 1727204266.12911: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.116) 0:00:30.914 ***** 22736 1727204266.12947: entering _queue_task() for managed-node2/fail 22736 1727204266.13358: worker is 1 (out of 1 available) 22736 1727204266.13372: exiting _queue_task() for managed-node2/fail 22736 1727204266.13386: done queuing things up, now waiting for results queue to drain 22736 1727204266.13387: waiting for pending results... 
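The task at main.yml:60 is a fail action that would stop the run to ask for consent before restarting NetworkManager; it is skipped because no wireless or team connections are defined for this profile. A sketch for illustration only; the consent variable name (network_allow_restart) and the message are hypothetical:

    - name: >-
        Ask user's consent to restart NetworkManager due to wireless or team
        interfaces
      ansible.builtin.fail:
        msg: >-
          Installing wireless or team support requires restarting NetworkManager;
          set the consent variable (illustrative name: network_allow_restart) to true
          and re-run the role.
      when:
        - __network_wireless_connections_defined or __network_team_connections_defined
        - not (network_allow_restart | default(false))   # hypothetical variable, for illustration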
22736 1727204266.13687: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204266.13810: in run() - task 12b410aa-8751-4f4a-548a-000000000043 22736 1727204266.13826: variable 'ansible_search_path' from source: unknown 22736 1727204266.13830: variable 'ansible_search_path' from source: unknown 22736 1727204266.13880: calling self._execute() 22736 1727204266.13998: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.14010: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.14024: variable 'omit' from source: magic vars 22736 1727204266.14658: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.14662: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.14666: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.14943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204266.17651: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204266.17747: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204266.17787: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204266.17836: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204266.17865: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204266.17955: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.17987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.18027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.18080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.18101: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.18169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.18202: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.18245: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.18296: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.18316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.18373: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.18404: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.18446: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.18494: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.18557: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.18755: variable 'network_connections' from source: play vars 22736 1727204266.18777: variable 'profile' from source: play vars 22736 1727204266.18868: variable 'profile' from source: play vars 22736 1727204266.18886: variable 'interface' from source: set_fact 22736 1727204266.18960: variable 'interface' from source: set_fact 22736 1727204266.19108: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204266.19283: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204266.19339: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204266.19378: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204266.19420: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204266.19477: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204266.19505: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204266.19635: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.19638: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204266.19641: 
variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204266.20094: variable 'network_connections' from source: play vars 22736 1727204266.20098: variable 'profile' from source: play vars 22736 1727204266.20101: variable 'profile' from source: play vars 22736 1727204266.20103: variable 'interface' from source: set_fact 22736 1727204266.20136: variable 'interface' from source: set_fact 22736 1727204266.20163: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204266.20166: when evaluation is False, skipping this task 22736 1727204266.20169: _execute() done 22736 1727204266.20179: dumping result to json 22736 1727204266.20188: done dumping result, returning 22736 1727204266.20198: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000043] 22736 1727204266.20208: sending task result for task 12b410aa-8751-4f4a-548a-000000000043 22736 1727204266.20510: done sending task result for task 12b410aa-8751-4f4a-548a-000000000043 22736 1727204266.20517: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204266.20568: no more pending results, returning what we have 22736 1727204266.20571: results queue empty 22736 1727204266.20572: checking for any_errors_fatal 22736 1727204266.20578: done checking for any_errors_fatal 22736 1727204266.20579: checking for max_fail_percentage 22736 1727204266.20581: done checking for max_fail_percentage 22736 1727204266.20582: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.20583: done checking to see if all hosts have failed 22736 1727204266.20584: getting the remaining hosts for this loop 22736 1727204266.20585: done getting the remaining hosts for this loop 22736 1727204266.20591: getting the next task for host managed-node2 22736 1727204266.20597: done getting next task for host managed-node2 22736 1727204266.20602: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22736 1727204266.20604: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204266.20619: getting variables 22736 1727204266.20621: in VariableManager get_vars() 22736 1727204266.20662: Calling all_inventory to load vars for managed-node2 22736 1727204266.20665: Calling groups_inventory to load vars for managed-node2 22736 1727204266.20668: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.20678: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.20682: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.20685: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.23029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.26069: done with get_vars() 22736 1727204266.26120: done getting variables 22736 1727204266.26196: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.132) 0:00:31.047 ***** 22736 1727204266.26239: entering _queue_task() for managed-node2/package 22736 1727204266.26693: worker is 1 (out of 1 available) 22736 1727204266.26708: exiting _queue_task() for managed-node2/package 22736 1727204266.26721: done queuing things up, now waiting for results queue to drain 22736 1727204266.26722: waiting for pending results... 22736 1727204266.26987: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22736 1727204266.27119: in run() - task 12b410aa-8751-4f4a-548a-000000000044 22736 1727204266.27133: variable 'ansible_search_path' from source: unknown 22736 1727204266.27137: variable 'ansible_search_path' from source: unknown 22736 1727204266.27177: calling self._execute() 22736 1727204266.27286: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.27296: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.27315: variable 'omit' from source: magic vars 22736 1727204266.27786: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.27801: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.28079: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204266.28438: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204266.28494: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204266.28547: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204266.29059: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204266.29223: variable 'network_packages' from source: role '' defaults 22736 1727204266.29395: variable '__network_provider_setup' from source: role '' defaults 22736 1727204266.29399: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204266.29495: variable 
'__network_service_name_default_nm' from source: role '' defaults 22736 1727204266.29571: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204266.29585: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204266.29866: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204266.32384: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204266.32457: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204266.32516: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204266.32557: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204266.32634: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204266.32698: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.32746: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.32777: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.32860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.32863: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.32917: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.32944: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.33042: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.33046: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.33049: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.33594: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204266.33598: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.33601: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.33603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.33636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.33651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.33768: variable 'ansible_python' from source: facts 22736 1727204266.33807: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204266.34082: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204266.34086: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204266.34193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.34232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.34261: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.34318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.34340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.34399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.34435: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.34467: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.34519: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.34544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.34737: variable 'network_connections' from source: play vars 22736 1727204266.34745: variable 'profile' from source: play vars 22736 1727204266.34874: variable 'profile' from source: play vars 22736 1727204266.34882: variable 'interface' from source: set_fact 22736 1727204266.34956: variable 'interface' from source: set_fact 22736 1727204266.35042: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204266.35074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204266.35122: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.35161: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204266.35228: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.35642: variable 'network_connections' from source: play vars 22736 1727204266.35646: variable 'profile' from source: play vars 22736 1727204266.35773: variable 'profile' from source: play vars 22736 1727204266.35781: variable 'interface' from source: set_fact 22736 1727204266.35874: variable 'interface' from source: set_fact 22736 1727204266.35919: variable '__network_packages_default_wireless' from source: role '' defaults 22736 1727204266.36029: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.36465: variable 'network_connections' from source: play vars 22736 1727204266.36595: variable 'profile' from source: play vars 22736 1727204266.36599: variable 'profile' from source: play vars 22736 1727204266.36601: variable 'interface' from source: set_fact 22736 1727204266.36682: variable 'interface' from source: set_fact 22736 1727204266.36723: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204266.36833: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204266.37271: variable 'network_connections' from source: play vars 22736 1727204266.37284: variable 'profile' from source: play vars 22736 1727204266.37362: variable 'profile' from source: play vars 22736 1727204266.37366: variable 'interface' from source: set_fact 22736 1727204266.37497: variable 'interface' from source: set_fact 22736 1727204266.37591: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204266.37650: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204266.37659: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204266.37799: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204266.38072: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204266.38737: variable 'network_connections' from source: play vars 22736 1727204266.38786: variable 'profile' from source: play vars 22736 
1727204266.38827: variable 'profile' from source: play vars 22736 1727204266.38831: variable 'interface' from source: set_fact 22736 1727204266.38918: variable 'interface' from source: set_fact 22736 1727204266.38926: variable 'ansible_distribution' from source: facts 22736 1727204266.38931: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.38938: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.39007: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204266.39185: variable 'ansible_distribution' from source: facts 22736 1727204266.39188: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.39200: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.39208: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204266.39454: variable 'ansible_distribution' from source: facts 22736 1727204266.39460: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.39468: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.39517: variable 'network_provider' from source: set_fact 22736 1727204266.39551: variable 'ansible_facts' from source: unknown 22736 1727204266.40795: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22736 1727204266.40798: when evaluation is False, skipping this task 22736 1727204266.40800: _execute() done 22736 1727204266.40802: dumping result to json 22736 1727204266.40804: done dumping result, returning 22736 1727204266.40806: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages [12b410aa-8751-4f4a-548a-000000000044] 22736 1727204266.40808: sending task result for task 12b410aa-8751-4f4a-548a-000000000044 22736 1727204266.40883: done sending task result for task 12b410aa-8751-4f4a-548a-000000000044 22736 1727204266.40887: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22736 1727204266.40951: no more pending results, returning what we have 22736 1727204266.40955: results queue empty 22736 1727204266.40956: checking for any_errors_fatal 22736 1727204266.41193: done checking for any_errors_fatal 22736 1727204266.41196: checking for max_fail_percentage 22736 1727204266.41198: done checking for max_fail_percentage 22736 1727204266.41199: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.41200: done checking to see if all hosts have failed 22736 1727204266.41201: getting the remaining hosts for this loop 22736 1727204266.41203: done getting the remaining hosts for this loop 22736 1727204266.41207: getting the next task for host managed-node2 22736 1727204266.41213: done getting next task for host managed-node2 22736 1727204266.41218: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204266.41220: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204266.41234: getting variables 22736 1727204266.41236: in VariableManager get_vars() 22736 1727204266.41275: Calling all_inventory to load vars for managed-node2 22736 1727204266.41278: Calling groups_inventory to load vars for managed-node2 22736 1727204266.41280: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.41302: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.41306: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.41310: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.43715: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.46741: done with get_vars() 22736 1727204266.46781: done getting variables 22736 1727204266.46862: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.206) 0:00:31.253 ***** 22736 1727204266.46902: entering _queue_task() for managed-node2/package 22736 1727204266.47316: worker is 1 (out of 1 available) 22736 1727204266.47330: exiting _queue_task() for managed-node2/package 22736 1727204266.47345: done queuing things up, now waiting for results queue to drain 22736 1727204266.47346: waiting for pending results... 
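For reference, the skip recorded in the trace that follows (task path roles/network/tasks/main.yml:85, action plugin 'package', false_condition "network_state != {}") is consistent with a task of roughly the shape below. This is an illustrative sketch only, not the role's actual source; the package names are inferred from the task name.

- name: Install NetworkManager and nmstate when using network_state variable
  ansible.builtin.package:
    name:
      - NetworkManager
      - nmstate
    state: present
  when: network_state != {}
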
22736 1727204266.47909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204266.47916: in run() - task 12b410aa-8751-4f4a-548a-000000000045 22736 1727204266.47920: variable 'ansible_search_path' from source: unknown 22736 1727204266.47923: variable 'ansible_search_path' from source: unknown 22736 1727204266.47926: calling self._execute() 22736 1727204266.47940: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.47951: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.47962: variable 'omit' from source: magic vars 22736 1727204266.48437: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.48451: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.48611: variable 'network_state' from source: role '' defaults 22736 1727204266.48624: Evaluated conditional (network_state != {}): False 22736 1727204266.48628: when evaluation is False, skipping this task 22736 1727204266.48631: _execute() done 22736 1727204266.48635: dumping result to json 22736 1727204266.48640: done dumping result, returning 22736 1727204266.48649: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000045] 22736 1727204266.48655: sending task result for task 12b410aa-8751-4f4a-548a-000000000045 22736 1727204266.48772: done sending task result for task 12b410aa-8751-4f4a-548a-000000000045 22736 1727204266.48775: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204266.48860: no more pending results, returning what we have 22736 1727204266.48864: results queue empty 22736 1727204266.48866: checking for any_errors_fatal 22736 1727204266.48877: done checking for any_errors_fatal 22736 1727204266.48878: checking for max_fail_percentage 22736 1727204266.48880: done checking for max_fail_percentage 22736 1727204266.48882: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.48883: done checking to see if all hosts have failed 22736 1727204266.48884: getting the remaining hosts for this loop 22736 1727204266.48886: done getting the remaining hosts for this loop 22736 1727204266.48893: getting the next task for host managed-node2 22736 1727204266.49094: done getting next task for host managed-node2 22736 1727204266.49100: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204266.49103: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204266.49119: getting variables 22736 1727204266.49121: in VariableManager get_vars() 22736 1727204266.49163: Calling all_inventory to load vars for managed-node2 22736 1727204266.49166: Calling groups_inventory to load vars for managed-node2 22736 1727204266.49169: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.49181: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.49185: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.49195: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.51608: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.54697: done with get_vars() 22736 1727204266.54746: done getting variables 22736 1727204266.54825: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.079) 0:00:31.333 ***** 22736 1727204266.54865: entering _queue_task() for managed-node2/package 22736 1727204266.55261: worker is 1 (out of 1 available) 22736 1727204266.55277: exiting _queue_task() for managed-node2/package 22736 1727204266.55294: done queuing things up, now waiting for results queue to drain 22736 1727204266.55296: waiting for pending results... 
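The trace that follows (task path roles/network/tasks/main.yml:96, action plugin 'package') skips on the same "network_state != {}" condition. A minimal sketch of such a task, with the package name taken from the task title and the remaining fields assumed:

- name: Install python3-libnmstate when using network_state variable
  ansible.builtin.package:
    name: python3-libnmstate
    state: present
  when: network_state != {}
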
22736 1727204266.55816: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204266.55845: in run() - task 12b410aa-8751-4f4a-548a-000000000046 22736 1727204266.55871: variable 'ansible_search_path' from source: unknown 22736 1727204266.55880: variable 'ansible_search_path' from source: unknown 22736 1727204266.55938: calling self._execute() 22736 1727204266.56060: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.56075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.56095: variable 'omit' from source: magic vars 22736 1727204266.56585: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.56595: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.56807: variable 'network_state' from source: role '' defaults 22736 1727204266.56811: Evaluated conditional (network_state != {}): False 22736 1727204266.56816: when evaluation is False, skipping this task 22736 1727204266.56819: _execute() done 22736 1727204266.56821: dumping result to json 22736 1727204266.56824: done dumping result, returning 22736 1727204266.56828: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000046] 22736 1727204266.56841: sending task result for task 12b410aa-8751-4f4a-548a-000000000046 22736 1727204266.57150: done sending task result for task 12b410aa-8751-4f4a-548a-000000000046 22736 1727204266.57153: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204266.57212: no more pending results, returning what we have 22736 1727204266.57219: results queue empty 22736 1727204266.57220: checking for any_errors_fatal 22736 1727204266.57229: done checking for any_errors_fatal 22736 1727204266.57230: checking for max_fail_percentage 22736 1727204266.57232: done checking for max_fail_percentage 22736 1727204266.57233: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.57234: done checking to see if all hosts have failed 22736 1727204266.57235: getting the remaining hosts for this loop 22736 1727204266.57237: done getting the remaining hosts for this loop 22736 1727204266.57242: getting the next task for host managed-node2 22736 1727204266.57249: done getting next task for host managed-node2 22736 1727204266.57255: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204266.57257: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204266.57276: getting variables 22736 1727204266.57278: in VariableManager get_vars() 22736 1727204266.57328: Calling all_inventory to load vars for managed-node2 22736 1727204266.57332: Calling groups_inventory to load vars for managed-node2 22736 1727204266.57335: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.57350: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.57355: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.57359: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.59692: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.62597: done with get_vars() 22736 1727204266.62648: done getting variables 22736 1727204266.62728: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.078) 0:00:31.412 ***** 22736 1727204266.62768: entering _queue_task() for managed-node2/service 22736 1727204266.63159: worker is 1 (out of 1 available) 22736 1727204266.63174: exiting _queue_task() for managed-node2/service 22736 1727204266.63187: done queuing things up, now waiting for results queue to drain 22736 1727204266.63191: waiting for pending results... 
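The next trace (task path roles/network/tasks/main.yml:109, action plugin 'service') is skipped because neither wireless nor team connections are defined. A hypothetical sketch of a task that would produce this skip; the service name and target state are assumptions based on the task title, not the role's source:

- name: Restart NetworkManager due to wireless or team interfaces
  ansible.builtin.service:
    name: NetworkManager
    state: restarted
  when: __network_wireless_connections_defined or __network_team_connections_defined
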
22736 1727204266.63481: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204266.63597: in run() - task 12b410aa-8751-4f4a-548a-000000000047 22736 1727204266.63618: variable 'ansible_search_path' from source: unknown 22736 1727204266.63623: variable 'ansible_search_path' from source: unknown 22736 1727204266.63726: calling self._execute() 22736 1727204266.63782: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.63793: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.63805: variable 'omit' from source: magic vars 22736 1727204266.64271: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.64284: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.64458: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.64740: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204266.70486: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204266.70623: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204266.70785: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204266.70825: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204266.70883: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204266.71068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.71232: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.71330: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.71351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.71371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.71653: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.71657: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.71660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22736 1727204266.71663: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.71665: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.71771: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.71803: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.71835: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.72018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.72031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.72495: variable 'network_connections' from source: play vars 22736 1727204266.72618: variable 'profile' from source: play vars 22736 1727204266.72807: variable 'profile' from source: play vars 22736 1727204266.72811: variable 'interface' from source: set_fact 22736 1727204266.72963: variable 'interface' from source: set_fact 22736 1727204266.73056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204266.73664: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204266.73724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204266.73787: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204266.73904: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204266.73960: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204266.73987: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204266.74132: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.74162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204266.74222: variable '__network_team_connections_defined' from source: role '' defaults 22736 
1727204266.74897: variable 'network_connections' from source: play vars 22736 1727204266.74900: variable 'profile' from source: play vars 22736 1727204266.74904: variable 'profile' from source: play vars 22736 1727204266.74906: variable 'interface' from source: set_fact 22736 1727204266.74970: variable 'interface' from source: set_fact 22736 1727204266.75002: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204266.75009: when evaluation is False, skipping this task 22736 1727204266.75012: _execute() done 22736 1727204266.75017: dumping result to json 22736 1727204266.75020: done dumping result, returning 22736 1727204266.75055: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000047] 22736 1727204266.75065: sending task result for task 12b410aa-8751-4f4a-548a-000000000047 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204266.75462: no more pending results, returning what we have 22736 1727204266.75466: results queue empty 22736 1727204266.75467: checking for any_errors_fatal 22736 1727204266.75474: done checking for any_errors_fatal 22736 1727204266.75475: checking for max_fail_percentage 22736 1727204266.75477: done checking for max_fail_percentage 22736 1727204266.75478: checking to see if all hosts have failed and the running result is not ok 22736 1727204266.75479: done checking to see if all hosts have failed 22736 1727204266.75480: getting the remaining hosts for this loop 22736 1727204266.75482: done getting the remaining hosts for this loop 22736 1727204266.75485: getting the next task for host managed-node2 22736 1727204266.75493: done getting next task for host managed-node2 22736 1727204266.75498: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204266.75500: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204266.75518: getting variables 22736 1727204266.75520: in VariableManager get_vars() 22736 1727204266.75562: Calling all_inventory to load vars for managed-node2 22736 1727204266.75565: Calling groups_inventory to load vars for managed-node2 22736 1727204266.75568: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204266.75579: Calling all_plugins_play to load vars for managed-node2 22736 1727204266.75583: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204266.75586: Calling groups_plugins_play to load vars for managed-node2 22736 1727204266.75601: done sending task result for task 12b410aa-8751-4f4a-548a-000000000047 22736 1727204266.75604: WORKER PROCESS EXITING 22736 1727204266.77965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204266.81006: done with get_vars() 22736 1727204266.81052: done getting variables 22736 1727204266.81136: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:46 -0400 (0:00:00.184) 0:00:31.596 ***** 22736 1727204266.81174: entering _queue_task() for managed-node2/service 22736 1727204266.81622: worker is 1 (out of 1 available) 22736 1727204266.81636: exiting _queue_task() for managed-node2/service 22736 1727204266.81701: done queuing things up, now waiting for results queue to drain 22736 1727204266.81702: waiting for pending results... 
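Unlike the preceding tasks, the trace that follows (task path roles/network/tasks/main.yml:122, action plugin 'service') has its conditional evaluate to True, resolves network_service_name from the role defaults, and goes on to execute the service module over SSH. A sketch of a task consistent with that trace; the state and enabled values are assumed from the task title rather than taken from the role:

- name: Enable and start NetworkManager
  ansible.builtin.service:
    name: "{{ network_service_name }}"
    state: started
    enabled: true
  when: network_provider == "nm" or network_state != {}
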
22736 1727204266.82007: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204266.82055: in run() - task 12b410aa-8751-4f4a-548a-000000000048 22736 1727204266.82071: variable 'ansible_search_path' from source: unknown 22736 1727204266.82075: variable 'ansible_search_path' from source: unknown 22736 1727204266.82125: calling self._execute() 22736 1727204266.82233: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.82240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.82252: variable 'omit' from source: magic vars 22736 1727204266.82715: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.82795: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204266.82938: variable 'network_provider' from source: set_fact 22736 1727204266.83223: variable 'network_state' from source: role '' defaults 22736 1727204266.83226: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22736 1727204266.83229: variable 'omit' from source: magic vars 22736 1727204266.83232: variable 'omit' from source: magic vars 22736 1727204266.83234: variable 'network_service_name' from source: role '' defaults 22736 1727204266.83237: variable 'network_service_name' from source: role '' defaults 22736 1727204266.83317: variable '__network_provider_setup' from source: role '' defaults 22736 1727204266.83322: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204266.83407: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204266.83418: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204266.83494: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204266.83819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204266.86452: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204266.86545: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204266.86597: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204266.86639: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204266.86672: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204266.86776: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.86823: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.86855: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.86920: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 22736 1727204266.86937: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.87002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.87037: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.87067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.87128: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.87146: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.87509: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204266.87664: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.87695: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.87729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.87784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.87804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.87929: variable 'ansible_python' from source: facts 22736 1727204266.87956: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204266.88154: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204266.88196: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204266.88395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.88400: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.88402: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.88447: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.88462: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.88525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204266.88553: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204266.88584: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.88901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204266.88905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204266.88907: variable 'network_connections' from source: play vars 22736 1727204266.88910: variable 'profile' from source: play vars 22736 1727204266.88928: variable 'profile' from source: play vars 22736 1727204266.88935: variable 'interface' from source: set_fact 22736 1727204266.89018: variable 'interface' from source: set_fact 22736 1727204266.89146: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204266.89602: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204266.89659: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204266.89706: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204266.89752: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204266.89832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204266.89867: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204266.89919: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204266.89959: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 22736 1727204266.90023: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.90425: variable 'network_connections' from source: play vars 22736 1727204266.90432: variable 'profile' from source: play vars 22736 1727204266.90531: variable 'profile' from source: play vars 22736 1727204266.90538: variable 'interface' from source: set_fact 22736 1727204266.90622: variable 'interface' from source: set_fact 22736 1727204266.90662: variable '__network_packages_default_wireless' from source: role '' defaults 22736 1727204266.90770: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204266.91177: variable 'network_connections' from source: play vars 22736 1727204266.91183: variable 'profile' from source: play vars 22736 1727204266.91278: variable 'profile' from source: play vars 22736 1727204266.91282: variable 'interface' from source: set_fact 22736 1727204266.91377: variable 'interface' from source: set_fact 22736 1727204266.91496: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204266.91520: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204266.91922: variable 'network_connections' from source: play vars 22736 1727204266.91929: variable 'profile' from source: play vars 22736 1727204266.92022: variable 'profile' from source: play vars 22736 1727204266.92028: variable 'interface' from source: set_fact 22736 1727204266.92122: variable 'interface' from source: set_fact 22736 1727204266.92191: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204266.92292: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204266.92299: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204266.92359: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204266.92725: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204266.93373: variable 'network_connections' from source: play vars 22736 1727204266.93376: variable 'profile' from source: play vars 22736 1727204266.93459: variable 'profile' from source: play vars 22736 1727204266.93463: variable 'interface' from source: set_fact 22736 1727204266.93597: variable 'interface' from source: set_fact 22736 1727204266.93600: variable 'ansible_distribution' from source: facts 22736 1727204266.93603: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.93606: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.93608: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204266.93823: variable 'ansible_distribution' from source: facts 22736 1727204266.93827: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.93843: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.93851: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204266.94194: variable 'ansible_distribution' from source: facts 22736 1727204266.94198: variable '__network_rh_distros' from source: role '' defaults 22736 1727204266.94201: variable 'ansible_distribution_major_version' from source: facts 22736 1727204266.94203: variable 'network_provider' from source: set_fact 22736 1727204266.94205: variable 
'omit' from source: magic vars 22736 1727204266.94217: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204266.94248: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204266.94277: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204266.94303: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204266.94321: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204266.94351: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204266.94355: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.94365: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.94503: Set connection var ansible_timeout to 10 22736 1727204266.94518: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204266.94526: Set connection var ansible_shell_executable to /bin/sh 22736 1727204266.94529: Set connection var ansible_shell_type to sh 22736 1727204266.94538: Set connection var ansible_pipelining to False 22736 1727204266.94541: Set connection var ansible_connection to ssh 22736 1727204266.94570: variable 'ansible_shell_executable' from source: unknown 22736 1727204266.94573: variable 'ansible_connection' from source: unknown 22736 1727204266.94577: variable 'ansible_module_compression' from source: unknown 22736 1727204266.94579: variable 'ansible_shell_type' from source: unknown 22736 1727204266.94584: variable 'ansible_shell_executable' from source: unknown 22736 1727204266.94587: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204266.94600: variable 'ansible_pipelining' from source: unknown 22736 1727204266.94610: variable 'ansible_timeout' from source: unknown 22736 1727204266.94617: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204266.94757: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204266.94770: variable 'omit' from source: magic vars 22736 1727204266.94776: starting attempt loop 22736 1727204266.94779: running the handler 22736 1727204266.94884: variable 'ansible_facts' from source: unknown 22736 1727204266.96112: _low_level_execute_command(): starting 22736 1727204266.96195: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204266.96979: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204266.97040: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204266.97076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204266.98864: stdout chunk (state=3): >>>/root <<< 22736 1727204266.98998: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204266.99079: stderr chunk (state=3): >>><<< 22736 1727204266.99092: stdout chunk (state=3): >>><<< 22736 1727204266.99130: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204266.99150: _low_level_execute_command(): starting 22736 1727204266.99160: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263 `" && echo ansible-tmp-1727204266.9913683-24327-87672715982263="` echo /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263 `" ) && sleep 0' 22736 1727204266.99878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204266.99895: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204266.99916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204266.99985: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204267.00053: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204267.00079: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.00119: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.00201: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.02293: stdout chunk (state=3): >>>ansible-tmp-1727204266.9913683-24327-87672715982263=/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263 <<< 22736 1727204267.02518: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.02522: stdout chunk (state=3): >>><<< 22736 1727204267.02525: stderr chunk (state=3): >>><<< 22736 1727204267.02544: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204266.9913683-24327-87672715982263=/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204267.02695: variable 'ansible_module_compression' from source: unknown 22736 1727204267.02698: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22736 1727204267.02735: variable 'ansible_facts' from source: unknown 22736 1727204267.02960: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py 22736 1727204267.03225: Sending initial data 22736 1727204267.03229: Sent initial data (155 bytes) 22736 1727204267.03839: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204267.03861: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 
1727204267.03906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204267.03943: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204267.03967: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.03985: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.04060: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.05834: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204267.05926: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204267.05957: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp123gizu0 /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py <<< 22736 1727204267.05987: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py" <<< 22736 1727204267.06007: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp123gizu0" to remote "/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py" <<< 22736 1727204267.08440: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.08574: stderr chunk (state=3): >>><<< 22736 1727204267.08587: stdout chunk (state=3): >>><<< 22736 1727204267.08625: done transferring module to remote 22736 1727204267.08643: _low_level_execute_command(): starting 22736 1727204267.08664: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/ /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py && sleep 0' 22736 1727204267.09372: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204267.09404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204267.09515: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.09551: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.09616: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.11694: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.11697: stdout chunk (state=3): >>><<< 22736 1727204267.11700: stderr chunk (state=3): >>><<< 22736 1727204267.11703: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204267.11705: _low_level_execute_command(): starting 22736 1727204267.11708: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/AnsiballZ_systemd.py && sleep 0' 22736 1727204267.12495: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204267.12499: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204267.12501: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204267.12505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204267.12510: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204267.12514: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.12517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.12572: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.45640: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4431872", "MemoryAvailable": "infinity", "CPUUsageNSec": "1307602000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", 
"StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "<<< 22736 1727204267.45647: stdout chunk (state=3): >>>infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", 
"ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": 
true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22736 1727204267.47904: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204267.47909: stdout chunk (state=3): >>><<< 22736 1727204267.47912: stderr chunk (state=3): >>><<< 22736 1727204267.47915: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4431872", "MemoryAvailable": "infinity", "CPUUsageNSec": "1307602000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", 
"DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": 
"system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204267.48187: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204267.48233: _low_level_execute_command(): starting 22736 1727204267.48246: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204266.9913683-24327-87672715982263/ > /dev/null 2>&1 && sleep 0' 22736 1727204267.48923: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204267.48941: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204267.48959: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204267.49009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204267.49025: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204267.49126: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.49159: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.49234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.51400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.51405: stdout chunk (state=3): >>><<< 22736 1727204267.51407: stderr chunk (state=3): >>><<< 22736 1727204267.51410: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204267.51412: handler run complete 22736 1727204267.51460: attempt loop complete, returning result 22736 1727204267.51469: _execute() done 22736 1727204267.51476: dumping result to json 22736 1727204267.51513: done dumping result, returning 22736 1727204267.51533: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-4f4a-548a-000000000048] 22736 1727204267.51542: sending task result for task 12b410aa-8751-4f4a-548a-000000000048 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204267.52023: no more pending results, returning what we have 22736 1727204267.52027: results queue empty 22736 1727204267.52028: checking for any_errors_fatal 22736 1727204267.52033: done checking for any_errors_fatal 22736 1727204267.52034: checking for max_fail_percentage 22736 1727204267.52036: done checking for max_fail_percentage 22736 1727204267.52036: checking to see if all hosts have failed and the running result is not ok 22736 1727204267.52038: done checking to see if all hosts have failed 22736 1727204267.52039: getting the remaining hosts for this loop 22736 1727204267.52041: done getting the remaining hosts for this loop 22736 1727204267.52045: getting the next task for host managed-node2 22736 1727204267.52052: done getting next task for host managed-node2 22736 1727204267.52056: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204267.52063: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 22736 1727204267.52075: getting variables 22736 1727204267.52077: in VariableManager get_vars() 22736 1727204267.52119: Calling all_inventory to load vars for managed-node2 22736 1727204267.52123: Calling groups_inventory to load vars for managed-node2 22736 1727204267.52125: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204267.52139: Calling all_plugins_play to load vars for managed-node2 22736 1727204267.52142: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204267.52146: Calling groups_plugins_play to load vars for managed-node2 22736 1727204267.52666: done sending task result for task 12b410aa-8751-4f4a-548a-000000000048 22736 1727204267.52671: WORKER PROCESS EXITING 22736 1727204267.53684: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204267.55576: done with get_vars() 22736 1727204267.55601: done getting variables 22736 1727204267.55651: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.745) 0:00:32.341 ***** 22736 1727204267.55678: entering _queue_task() for managed-node2/service 22736 1727204267.55933: worker is 1 (out of 1 available) 22736 1727204267.55948: exiting _queue_task() for managed-node2/service 22736 1727204267.55961: done queuing things up, now waiting for results queue to drain 22736 1727204267.55963: waiting for pending results... 
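
The module_args recorded in the preceding task output show that the role enabled and started NetworkManager through the systemd module (name=NetworkManager, state=started, enabled=true), with the result reported as ok/unchanged and the payload censored because no_log was set. A minimal standalone sketch of an equivalent task follows; it is an assumption reconstructed from the logged module_args, not the role's actual task file (roles/network/tasks/main.yml), which may differ.

    # Hypothetical standalone equivalent of the invocation logged above.
    # name/state/enabled are taken from the recorded module_args; no_log
    # mirrors the "censored" result shown in the task output.
    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager
        state: started
        enabled: true
      no_log: true
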
22736 1727204267.56156: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204267.56237: in run() - task 12b410aa-8751-4f4a-548a-000000000049 22736 1727204267.56250: variable 'ansible_search_path' from source: unknown 22736 1727204267.56253: variable 'ansible_search_path' from source: unknown 22736 1727204267.56286: calling self._execute() 22736 1727204267.56371: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.56376: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.56387: variable 'omit' from source: magic vars 22736 1727204267.56719: variable 'ansible_distribution_major_version' from source: facts 22736 1727204267.56732: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204267.56838: variable 'network_provider' from source: set_fact 22736 1727204267.56842: Evaluated conditional (network_provider == "nm"): True 22736 1727204267.56924: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204267.57000: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204267.57152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204267.58819: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204267.58873: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204267.58906: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204267.58941: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204267.58964: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204267.59048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204267.59073: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204267.59097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204267.59133: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204267.59151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204267.59193: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204267.59213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22736 1727204267.59236: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204267.59270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204267.59282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204267.59321: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204267.59340: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204267.59363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204267.59395: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204267.59408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204267.59544: variable 'network_connections' from source: play vars 22736 1727204267.59556: variable 'profile' from source: play vars 22736 1727204267.59624: variable 'profile' from source: play vars 22736 1727204267.59628: variable 'interface' from source: set_fact 22736 1727204267.59681: variable 'interface' from source: set_fact 22736 1727204267.59757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204267.59894: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204267.59934: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204267.59961: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204267.59987: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204267.60034: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204267.60060: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204267.60081: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204267.60105: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204267.60152: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204267.60370: variable 'network_connections' from source: play vars 22736 1727204267.60376: variable 'profile' from source: play vars 22736 1727204267.60431: variable 'profile' from source: play vars 22736 1727204267.60434: variable 'interface' from source: set_fact 22736 1727204267.60486: variable 'interface' from source: set_fact 22736 1727204267.60514: Evaluated conditional (__network_wpa_supplicant_required): False 22736 1727204267.60520: when evaluation is False, skipping this task 22736 1727204267.60524: _execute() done 22736 1727204267.60535: dumping result to json 22736 1727204267.60538: done dumping result, returning 22736 1727204267.60541: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-4f4a-548a-000000000049] 22736 1727204267.60543: sending task result for task 12b410aa-8751-4f4a-548a-000000000049 22736 1727204267.60639: done sending task result for task 12b410aa-8751-4f4a-548a-000000000049 22736 1727204267.60643: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22736 1727204267.60720: no more pending results, returning what we have 22736 1727204267.60724: results queue empty 22736 1727204267.60725: checking for any_errors_fatal 22736 1727204267.60754: done checking for any_errors_fatal 22736 1727204267.60755: checking for max_fail_percentage 22736 1727204267.60756: done checking for max_fail_percentage 22736 1727204267.60757: checking to see if all hosts have failed and the running result is not ok 22736 1727204267.60758: done checking to see if all hosts have failed 22736 1727204267.60759: getting the remaining hosts for this loop 22736 1727204267.60761: done getting the remaining hosts for this loop 22736 1727204267.60766: getting the next task for host managed-node2 22736 1727204267.60774: done getting next task for host managed-node2 22736 1727204267.60778: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204267.60780: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204267.60797: getting variables 22736 1727204267.60798: in VariableManager get_vars() 22736 1727204267.60837: Calling all_inventory to load vars for managed-node2 22736 1727204267.60840: Calling groups_inventory to load vars for managed-node2 22736 1727204267.60843: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204267.60853: Calling all_plugins_play to load vars for managed-node2 22736 1727204267.60856: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204267.60859: Calling groups_plugins_play to load vars for managed-node2 22736 1727204267.62688: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204267.64596: done with get_vars() 22736 1727204267.64621: done getting variables 22736 1727204267.64673: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.090) 0:00:32.431 ***** 22736 1727204267.64701: entering _queue_task() for managed-node2/service 22736 1727204267.64963: worker is 1 (out of 1 available) 22736 1727204267.64978: exiting _queue_task() for managed-node2/service 22736 1727204267.64994: done queuing things up, now waiting for results queue to drain 22736 1727204267.64996: waiting for pending results... 22736 1727204267.65188: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204267.65270: in run() - task 12b410aa-8751-4f4a-548a-00000000004a 22736 1727204267.65281: variable 'ansible_search_path' from source: unknown 22736 1727204267.65285: variable 'ansible_search_path' from source: unknown 22736 1727204267.65323: calling self._execute() 22736 1727204267.65402: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.65408: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.65420: variable 'omit' from source: magic vars 22736 1727204267.65745: variable 'ansible_distribution_major_version' from source: facts 22736 1727204267.65756: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204267.65858: variable 'network_provider' from source: set_fact 22736 1727204267.65862: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204267.65866: when evaluation is False, skipping this task 22736 1727204267.65873: _execute() done 22736 1727204267.65878: dumping result to json 22736 1727204267.65880: done dumping result, returning 22736 1727204267.65891: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-4f4a-548a-00000000004a] 22736 1727204267.65897: sending task result for task 12b410aa-8751-4f4a-548a-00000000004a 22736 1727204267.65983: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004a 22736 1727204267.65986: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 
1727204267.66040: no more pending results, returning what we have 22736 1727204267.66043: results queue empty 22736 1727204267.66045: checking for any_errors_fatal 22736 1727204267.66053: done checking for any_errors_fatal 22736 1727204267.66054: checking for max_fail_percentage 22736 1727204267.66056: done checking for max_fail_percentage 22736 1727204267.66058: checking to see if all hosts have failed and the running result is not ok 22736 1727204267.66059: done checking to see if all hosts have failed 22736 1727204267.66059: getting the remaining hosts for this loop 22736 1727204267.66061: done getting the remaining hosts for this loop 22736 1727204267.66066: getting the next task for host managed-node2 22736 1727204267.66072: done getting next task for host managed-node2 22736 1727204267.66076: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204267.66079: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204267.66098: getting variables 22736 1727204267.66104: in VariableManager get_vars() 22736 1727204267.66141: Calling all_inventory to load vars for managed-node2 22736 1727204267.66144: Calling groups_inventory to load vars for managed-node2 22736 1727204267.66147: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204267.66157: Calling all_plugins_play to load vars for managed-node2 22736 1727204267.66160: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204267.66164: Calling groups_plugins_play to load vars for managed-node2 22736 1727204267.67320: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204267.68861: done with get_vars() 22736 1727204267.68885: done getting variables 22736 1727204267.68936: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.042) 0:00:32.474 ***** 22736 1727204267.68962: entering _queue_task() for managed-node2/copy 22736 1727204267.69201: worker is 1 (out of 1 available) 22736 1727204267.69217: exiting _queue_task() for managed-node2/copy 22736 1727204267.69231: done queuing things up, now waiting for results queue to drain 22736 1727204267.69232: waiting for pending results... 
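
Annotation: the "Enable network service" task above and the "Ensure initscripts network file dependency is present" task below are both guarded by the same condition, so they are skipped on this run because the provider resolved to NetworkManager (nm) rather than initscripts. A minimal sketch of such a guarded task, assuming the role's conventional layout (the dest and content values are illustrative placeholders, not taken from this log; only the task name, the copy action, and the when condition are confirmed by the records here):

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        dest: /etc/sysconfig/network            # illustrative placeholder path
        content: "# Created by the network system role\n"
      when: network_provider == "initscripts"
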
22736 1727204267.69420: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204267.69505: in run() - task 12b410aa-8751-4f4a-548a-00000000004b 22736 1727204267.69521: variable 'ansible_search_path' from source: unknown 22736 1727204267.69525: variable 'ansible_search_path' from source: unknown 22736 1727204267.69558: calling self._execute() 22736 1727204267.69641: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.69647: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.69657: variable 'omit' from source: magic vars 22736 1727204267.69979: variable 'ansible_distribution_major_version' from source: facts 22736 1727204267.69991: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204267.70094: variable 'network_provider' from source: set_fact 22736 1727204267.70098: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204267.70103: when evaluation is False, skipping this task 22736 1727204267.70107: _execute() done 22736 1727204267.70112: dumping result to json 22736 1727204267.70120: done dumping result, returning 22736 1727204267.70129: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-4f4a-548a-00000000004b] 22736 1727204267.70134: sending task result for task 12b410aa-8751-4f4a-548a-00000000004b 22736 1727204267.70231: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004b 22736 1727204267.70234: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22736 1727204267.70288: no more pending results, returning what we have 22736 1727204267.70294: results queue empty 22736 1727204267.70295: checking for any_errors_fatal 22736 1727204267.70301: done checking for any_errors_fatal 22736 1727204267.70302: checking for max_fail_percentage 22736 1727204267.70304: done checking for max_fail_percentage 22736 1727204267.70305: checking to see if all hosts have failed and the running result is not ok 22736 1727204267.70306: done checking to see if all hosts have failed 22736 1727204267.70307: getting the remaining hosts for this loop 22736 1727204267.70309: done getting the remaining hosts for this loop 22736 1727204267.70313: getting the next task for host managed-node2 22736 1727204267.70319: done getting next task for host managed-node2 22736 1727204267.70324: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204267.70326: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204267.70340: getting variables 22736 1727204267.70342: in VariableManager get_vars() 22736 1727204267.70387: Calling all_inventory to load vars for managed-node2 22736 1727204267.70396: Calling groups_inventory to load vars for managed-node2 22736 1727204267.70399: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204267.70409: Calling all_plugins_play to load vars for managed-node2 22736 1727204267.70412: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204267.70415: Calling groups_plugins_play to load vars for managed-node2 22736 1727204267.71739: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204267.73304: done with get_vars() 22736 1727204267.73328: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:47 -0400 (0:00:00.044) 0:00:32.518 ***** 22736 1727204267.73398: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204267.73643: worker is 1 (out of 1 available) 22736 1727204267.73659: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204267.73673: done queuing things up, now waiting for results queue to drain 22736 1727204267.73675: waiting for pending results... 22736 1727204267.73867: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204267.73956: in run() - task 12b410aa-8751-4f4a-548a-00000000004c 22736 1727204267.73969: variable 'ansible_search_path' from source: unknown 22736 1727204267.73973: variable 'ansible_search_path' from source: unknown 22736 1727204267.74007: calling self._execute() 22736 1727204267.74088: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.74129: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.74133: variable 'omit' from source: magic vars 22736 1727204267.74430: variable 'ansible_distribution_major_version' from source: facts 22736 1727204267.74440: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204267.74449: variable 'omit' from source: magic vars 22736 1727204267.74483: variable 'omit' from source: magic vars 22736 1727204267.74630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204267.77197: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204267.77318: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204267.77322: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204267.77373: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204267.77425: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204267.77542: variable 'network_provider' from source: set_fact 22736 1727204267.77729: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 22736 1727204267.77799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204267.77963: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204267.77966: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204267.77969: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204267.78096: variable 'omit' from source: magic vars 22736 1727204267.78254: variable 'omit' from source: magic vars 22736 1727204267.78404: variable 'network_connections' from source: play vars 22736 1727204267.78431: variable 'profile' from source: play vars 22736 1727204267.78529: variable 'profile' from source: play vars 22736 1727204267.78541: variable 'interface' from source: set_fact 22736 1727204267.78624: variable 'interface' from source: set_fact 22736 1727204267.78882: variable 'omit' from source: magic vars 22736 1727204267.78903: variable '__lsr_ansible_managed' from source: task vars 22736 1727204267.78993: variable '__lsr_ansible_managed' from source: task vars 22736 1727204267.79495: Loaded config def from plugin (lookup/template) 22736 1727204267.79498: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22736 1727204267.79526: File lookup term: get_ansible_managed.j2 22736 1727204267.79535: variable 'ansible_search_path' from source: unknown 22736 1727204267.79553: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22736 1727204267.79656: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22736 1727204267.79660: variable 'ansible_search_path' from source: unknown 22736 1727204267.91764: variable 'ansible_managed' from source: unknown 22736 1727204267.92032: variable 'omit' from source: magic vars 22736 1727204267.92073: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204267.92114: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204267.92146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204267.92174: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204267.92195: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204267.92234: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204267.92252: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.92264: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.92395: Set connection var ansible_timeout to 10 22736 1727204267.92464: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204267.92467: Set connection var ansible_shell_executable to /bin/sh 22736 1727204267.92470: Set connection var ansible_shell_type to sh 22736 1727204267.92472: Set connection var ansible_pipelining to False 22736 1727204267.92475: Set connection var ansible_connection to ssh 22736 1727204267.92495: variable 'ansible_shell_executable' from source: unknown 22736 1727204267.92504: variable 'ansible_connection' from source: unknown 22736 1727204267.92513: variable 'ansible_module_compression' from source: unknown 22736 1727204267.92522: variable 'ansible_shell_type' from source: unknown 22736 1727204267.92530: variable 'ansible_shell_executable' from source: unknown 22736 1727204267.92538: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204267.92572: variable 'ansible_pipelining' from source: unknown 22736 1727204267.92575: variable 'ansible_timeout' from source: unknown 22736 1727204267.92578: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204267.92736: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204267.92791: variable 'omit' from source: magic vars 22736 1727204267.92794: starting attempt loop 22736 1727204267.92799: running the handler 22736 1727204267.92807: _low_level_execute_command(): starting 22736 1727204267.92819: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204267.93672: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.93730: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204267.93787: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.95564: stdout chunk (state=3): >>>/root <<< 22736 1727204267.95772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.95776: stdout chunk (state=3): >>><<< 22736 1727204267.95779: stderr chunk (state=3): >>><<< 22736 1727204267.95803: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204267.95824: _low_level_execute_command(): starting 22736 1727204267.95836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137 `" && echo ansible-tmp-1727204267.9581053-24401-89723440673137="` echo /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137 `" ) && sleep 0' 22736 1727204267.96594: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204267.96611: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204267.96628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 
1727204267.96725: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204267.98745: stdout chunk (state=3): >>>ansible-tmp-1727204267.9581053-24401-89723440673137=/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137 <<< 22736 1727204267.98866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204267.98954: stderr chunk (state=3): >>><<< 22736 1727204267.98968: stdout chunk (state=3): >>><<< 22736 1727204267.99006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204267.9581053-24401-89723440673137=/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204267.99068: variable 'ansible_module_compression' from source: unknown 22736 1727204267.99131: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22736 1727204267.99192: variable 'ansible_facts' from source: unknown 22736 1727204267.99351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py 22736 1727204267.99632: Sending initial data 22736 1727204267.99636: Sent initial data (167 bytes) 22736 1727204268.00220: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204268.00322: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204268.00345: stderr chunk 
(state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204268.00373: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204268.00422: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204268.02083: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204268.02125: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204268.02168: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpxg2_et9l /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py <<< 22736 1727204268.02171: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py" <<< 22736 1727204268.02231: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpxg2_et9l" to remote "/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py" <<< 22736 1727204268.03825: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204268.03860: stderr chunk (state=3): >>><<< 22736 1727204268.03870: stdout chunk (state=3): >>><<< 22736 1727204268.03903: done transferring module to remote 22736 1727204268.03922: _low_level_execute_command(): starting 22736 1727204268.04014: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/ /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py && sleep 0' 22736 1727204268.04587: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204268.04606: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204268.04620: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204268.04749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204268.04778: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204268.04840: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204268.06996: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204268.07000: stderr chunk (state=3): >>><<< 22736 1727204268.07003: stdout chunk (state=3): >>><<< 22736 1727204268.07006: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204268.07008: _low_level_execute_command(): starting 22736 1727204268.07011: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/AnsiballZ_network_connections.py && sleep 0' 22736 1727204268.07675: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204268.07803: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204268.07831: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204268.07906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204268.44519: 
stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22736 1727204268.46298: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204268.46587: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 22736 1727204268.46594: stdout chunk (state=3): >>><<< 22736 1727204268.46597: stderr chunk (state=3): >>><<< 22736 1727204268.46604: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
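
Annotation: the module_args echoed back in that result show what the role ultimately handed to fedora.linux_system_roles.network_connections: the nm provider and a single profile, lsr27, to be taken down. A sketch of play-level input that would drive an equivalent invocation, assuming the role is applied directly (in this run the connection entry is actually built from the 'profile' and 'interface' variables resolved earlier, and the provider comes from a set_fact, so the literal play differs):

    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_connections:
              - name: lsr27
                state: down
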
22736 1727204268.46608: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204268.46611: _low_level_execute_command(): starting 22736 1727204268.46616: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204267.9581053-24401-89723440673137/ > /dev/null 2>&1 && sleep 0' 22736 1727204268.47748: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204268.47810: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204268.47953: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204268.47966: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204268.48054: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204268.50082: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204268.50157: stderr chunk (state=3): >>><<< 22736 1727204268.50295: stdout chunk (state=3): >>><<< 22736 1727204268.50299: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204268.50302: handler run complete 22736 1727204268.50337: attempt loop complete, returning result 22736 1727204268.50598: _execute() done 22736 1727204268.50601: dumping result to json 22736 1727204268.50604: done dumping result, returning 22736 1727204268.50606: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-4f4a-548a-00000000004c] 22736 1727204268.50608: sending task result for task 12b410aa-8751-4f4a-548a-00000000004c 22736 1727204268.50692: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004c 22736 1727204268.50696: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22736 1727204268.51044: no more pending results, returning what we have 22736 1727204268.51048: results queue empty 22736 1727204268.51050: checking for any_errors_fatal 22736 1727204268.51057: done checking for any_errors_fatal 22736 1727204268.51058: checking for max_fail_percentage 22736 1727204268.51060: done checking for max_fail_percentage 22736 1727204268.51061: checking to see if all hosts have failed and the running result is not ok 22736 1727204268.51062: done checking to see if all hosts have failed 22736 1727204268.51063: getting the remaining hosts for this loop 22736 1727204268.51065: done getting the remaining hosts for this loop 22736 1727204268.51070: getting the next task for host managed-node2 22736 1727204268.51077: done getting next task for host managed-node2 22736 1727204268.51082: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204268.51084: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204268.51305: getting variables 22736 1727204268.51308: in VariableManager get_vars() 22736 1727204268.51354: Calling all_inventory to load vars for managed-node2 22736 1727204268.51357: Calling groups_inventory to load vars for managed-node2 22736 1727204268.51361: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204268.51373: Calling all_plugins_play to load vars for managed-node2 22736 1727204268.51377: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204268.51382: Calling groups_plugins_play to load vars for managed-node2 22736 1727204268.55801: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204268.62142: done with get_vars() 22736 1727204268.62179: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.889) 0:00:33.408 ***** 22736 1727204268.62498: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204268.63218: worker is 1 (out of 1 available) 22736 1727204268.63234: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204268.63249: done queuing things up, now waiting for results queue to drain 22736 1727204268.63250: waiting for pending results... 22736 1727204268.63701: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204268.64198: in run() - task 12b410aa-8751-4f4a-548a-00000000004d 22736 1727204268.64202: variable 'ansible_search_path' from source: unknown 22736 1727204268.64206: variable 'ansible_search_path' from source: unknown 22736 1727204268.64209: calling self._execute() 22736 1727204268.64215: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.64595: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.64599: variable 'omit' from source: magic vars 22736 1727204268.65262: variable 'ansible_distribution_major_version' from source: facts 22736 1727204268.65594: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204268.65683: variable 'network_state' from source: role '' defaults 22736 1727204268.65706: Evaluated conditional (network_state != {}): False 22736 1727204268.65809: when evaluation is False, skipping this task 22736 1727204268.65822: _execute() done 22736 1727204268.65832: dumping result to json 22736 1727204268.65840: done dumping result, returning 22736 1727204268.65852: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-4f4a-548a-00000000004d] 22736 1727204268.65862: sending task result for task 12b410aa-8751-4f4a-548a-00000000004d 22736 1727204268.65979: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004d 22736 1727204268.65988: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204268.66059: no more pending results, returning what we have 22736 1727204268.66064: results queue empty 22736 1727204268.66065: checking for any_errors_fatal 22736 1727204268.66075: done checking for any_errors_fatal 22736 1727204268.66076: checking for max_fail_percentage 22736 
1727204268.66078: done checking for max_fail_percentage 22736 1727204268.66079: checking to see if all hosts have failed and the running result is not ok 22736 1727204268.66080: done checking to see if all hosts have failed 22736 1727204268.66081: getting the remaining hosts for this loop 22736 1727204268.66082: done getting the remaining hosts for this loop 22736 1727204268.66087: getting the next task for host managed-node2 22736 1727204268.66096: done getting next task for host managed-node2 22736 1727204268.66101: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204268.66104: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204268.66122: getting variables 22736 1727204268.66124: in VariableManager get_vars() 22736 1727204268.66165: Calling all_inventory to load vars for managed-node2 22736 1727204268.66168: Calling groups_inventory to load vars for managed-node2 22736 1727204268.66170: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204268.66184: Calling all_plugins_play to load vars for managed-node2 22736 1727204268.66188: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204268.66411: Calling groups_plugins_play to load vars for managed-node2 22736 1727204268.70808: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204268.76988: done with get_vars() 22736 1727204268.77045: done getting variables 22736 1727204268.77118: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.147) 0:00:33.556 ***** 22736 1727204268.77165: entering _queue_task() for managed-node2/debug 22736 1727204268.77556: worker is 1 (out of 1 available) 22736 1727204268.77569: exiting _queue_task() for managed-node2/debug 22736 1727204268.77699: done queuing things up, now waiting for results queue to drain 22736 1727204268.77701: waiting for pending results... 
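
Annotation: the "Configure networking state" task above was skipped because the role default for network_state is an empty dict, so the condition network_state != {} is False. The network_state path would only run when a declarative state is supplied; a sketch of what that variable could look like, assuming nmstate-style input (the keys below are illustrative and not taken from this log):

    vars:
      network_state:
        interfaces:
          - name: lsr27
            type: ethernet
            state: down

With network_state non-empty, the conditional seen in these records would evaluate True and the task would run instead of being skipped.
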
22736 1727204268.77908: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204268.78040: in run() - task 12b410aa-8751-4f4a-548a-00000000004e 22736 1727204268.78065: variable 'ansible_search_path' from source: unknown 22736 1727204268.78073: variable 'ansible_search_path' from source: unknown 22736 1727204268.78125: calling self._execute() 22736 1727204268.78249: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.78265: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.78282: variable 'omit' from source: magic vars 22736 1727204268.78779: variable 'ansible_distribution_major_version' from source: facts 22736 1727204268.78786: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204268.78888: variable 'omit' from source: magic vars 22736 1727204268.78893: variable 'omit' from source: magic vars 22736 1727204268.78921: variable 'omit' from source: magic vars 22736 1727204268.78972: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204268.79027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204268.79058: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204268.79085: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204268.79109: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204268.79156: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204268.79165: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.79174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.79312: Set connection var ansible_timeout to 10 22736 1727204268.79339: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204268.79356: Set connection var ansible_shell_executable to /bin/sh 22736 1727204268.79367: Set connection var ansible_shell_type to sh 22736 1727204268.79379: Set connection var ansible_pipelining to False 22736 1727204268.79433: Set connection var ansible_connection to ssh 22736 1727204268.79437: variable 'ansible_shell_executable' from source: unknown 22736 1727204268.79439: variable 'ansible_connection' from source: unknown 22736 1727204268.79443: variable 'ansible_module_compression' from source: unknown 22736 1727204268.79445: variable 'ansible_shell_type' from source: unknown 22736 1727204268.79449: variable 'ansible_shell_executable' from source: unknown 22736 1727204268.79457: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.79470: variable 'ansible_pipelining' from source: unknown 22736 1727204268.79479: variable 'ansible_timeout' from source: unknown 22736 1727204268.79488: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.79673: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 
1727204268.79761: variable 'omit' from source: magic vars 22736 1727204268.79764: starting attempt loop 22736 1727204268.79767: running the handler 22736 1727204268.79887: variable '__network_connections_result' from source: set_fact 22736 1727204268.79955: handler run complete 22736 1727204268.79995: attempt loop complete, returning result 22736 1727204268.80003: _execute() done 22736 1727204268.80012: dumping result to json 22736 1727204268.80021: done dumping result, returning 22736 1727204268.80035: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-4f4a-548a-00000000004e] 22736 1727204268.80044: sending task result for task 12b410aa-8751-4f4a-548a-00000000004e ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 22736 1727204268.80366: no more pending results, returning what we have 22736 1727204268.80371: results queue empty 22736 1727204268.80373: checking for any_errors_fatal 22736 1727204268.80381: done checking for any_errors_fatal 22736 1727204268.80382: checking for max_fail_percentage 22736 1727204268.80384: done checking for max_fail_percentage 22736 1727204268.80385: checking to see if all hosts have failed and the running result is not ok 22736 1727204268.80386: done checking to see if all hosts have failed 22736 1727204268.80387: getting the remaining hosts for this loop 22736 1727204268.80392: done getting the remaining hosts for this loop 22736 1727204268.80398: getting the next task for host managed-node2 22736 1727204268.80405: done getting next task for host managed-node2 22736 1727204268.80410: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204268.80413: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204268.80426: getting variables 22736 1727204268.80428: in VariableManager get_vars() 22736 1727204268.80472: Calling all_inventory to load vars for managed-node2 22736 1727204268.80476: Calling groups_inventory to load vars for managed-node2 22736 1727204268.80479: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204268.80609: Calling all_plugins_play to load vars for managed-node2 22736 1727204268.80614: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204268.80621: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004e 22736 1727204268.80624: WORKER PROCESS EXITING 22736 1727204268.80629: Calling groups_plugins_play to load vars for managed-node2 22736 1727204268.83364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204268.86350: done with get_vars() 22736 1727204268.86404: done getting variables 22736 1727204268.86473: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:48 -0400 (0:00:00.093) 0:00:33.650 ***** 22736 1727204268.86519: entering _queue_task() for managed-node2/debug 22736 1727204268.86952: worker is 1 (out of 1 available) 22736 1727204268.86967: exiting _queue_task() for managed-node2/debug 22736 1727204268.86982: done queuing things up, now waiting for results queue to drain 22736 1727204268.86983: waiting for pending results... 
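
Annotation: the two debug tasks around this point surface the registered __network_connections_result fact; the first printed its (empty) stderr_lines above, and the next prints the full structure. Judging from the task names, the debug action module loaded here, and the variables resolved, the role tasks are plausibly along these lines (a sketch, not copied from tasks/main.yml):

    - name: Show stderr messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result.stderr_lines

    - name: Show debug messages for the network_connections
      ansible.builtin.debug:
        var: __network_connections_result
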
22736 1727204268.87267: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204268.87406: in run() - task 12b410aa-8751-4f4a-548a-00000000004f 22736 1727204268.87430: variable 'ansible_search_path' from source: unknown 22736 1727204268.87438: variable 'ansible_search_path' from source: unknown 22736 1727204268.87483: calling self._execute() 22736 1727204268.87610: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.87627: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.87644: variable 'omit' from source: magic vars 22736 1727204268.88117: variable 'ansible_distribution_major_version' from source: facts 22736 1727204268.88137: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204268.88155: variable 'omit' from source: magic vars 22736 1727204268.88211: variable 'omit' from source: magic vars 22736 1727204268.88270: variable 'omit' from source: magic vars 22736 1727204268.88321: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204268.88481: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204268.88485: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204268.88488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204268.88493: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204268.88496: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204268.88499: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.88510: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.88646: Set connection var ansible_timeout to 10 22736 1727204268.88668: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204268.88684: Set connection var ansible_shell_executable to /bin/sh 22736 1727204268.88700: Set connection var ansible_shell_type to sh 22736 1727204268.88713: Set connection var ansible_pipelining to False 22736 1727204268.88726: Set connection var ansible_connection to ssh 22736 1727204268.88758: variable 'ansible_shell_executable' from source: unknown 22736 1727204268.88768: variable 'ansible_connection' from source: unknown 22736 1727204268.88778: variable 'ansible_module_compression' from source: unknown 22736 1727204268.88787: variable 'ansible_shell_type' from source: unknown 22736 1727204268.88807: variable 'ansible_shell_executable' from source: unknown 22736 1727204268.88810: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204268.88895: variable 'ansible_pipelining' from source: unknown 22736 1727204268.88899: variable 'ansible_timeout' from source: unknown 22736 1727204268.88902: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204268.89034: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 
1727204268.89058: variable 'omit' from source: magic vars 22736 1727204268.89070: starting attempt loop 22736 1727204268.89079: running the handler 22736 1727204268.89148: variable '__network_connections_result' from source: set_fact 22736 1727204268.89269: variable '__network_connections_result' from source: set_fact 22736 1727204268.89420: handler run complete 22736 1727204268.89466: attempt loop complete, returning result 22736 1727204268.89475: _execute() done 22736 1727204268.89568: dumping result to json 22736 1727204268.89571: done dumping result, returning 22736 1727204268.89575: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-4f4a-548a-00000000004f] 22736 1727204268.89577: sending task result for task 12b410aa-8751-4f4a-548a-00000000004f 22736 1727204268.89655: done sending task result for task 12b410aa-8751-4f4a-548a-00000000004f 22736 1727204268.89659: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22736 1727204268.89767: no more pending results, returning what we have 22736 1727204268.89897: results queue empty 22736 1727204268.89899: checking for any_errors_fatal 22736 1727204268.89907: done checking for any_errors_fatal 22736 1727204268.89908: checking for max_fail_percentage 22736 1727204268.89910: done checking for max_fail_percentage 22736 1727204268.89911: checking to see if all hosts have failed and the running result is not ok 22736 1727204268.89912: done checking to see if all hosts have failed 22736 1727204268.89913: getting the remaining hosts for this loop 22736 1727204268.89915: done getting the remaining hosts for this loop 22736 1727204268.89920: getting the next task for host managed-node2 22736 1727204268.89927: done getting next task for host managed-node2 22736 1727204268.89932: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204268.89935: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204268.89949: getting variables 22736 1727204268.89951: in VariableManager get_vars() 22736 1727204268.90104: Calling all_inventory to load vars for managed-node2 22736 1727204268.90108: Calling groups_inventory to load vars for managed-node2 22736 1727204268.90111: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204268.90123: Calling all_plugins_play to load vars for managed-node2 22736 1727204268.90127: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204268.90131: Calling groups_plugins_play to load vars for managed-node2 22736 1727204268.92483: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.00427: done with get_vars() 22736 1727204269.00469: done getting variables 22736 1727204269.00538: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.140) 0:00:33.790 ***** 22736 1727204269.00595: entering _queue_task() for managed-node2/debug 22736 1727204269.01610: worker is 1 (out of 1 available) 22736 1727204269.01621: exiting _queue_task() for managed-node2/debug 22736 1727204269.01632: done queuing things up, now waiting for results queue to drain 22736 1727204269.01634: waiting for pending results... 22736 1727204269.02032: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204269.02301: in run() - task 12b410aa-8751-4f4a-548a-000000000050 22736 1727204269.02306: variable 'ansible_search_path' from source: unknown 22736 1727204269.02309: variable 'ansible_search_path' from source: unknown 22736 1727204269.02312: calling self._execute() 22736 1727204269.02636: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204269.02671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204269.02675: variable 'omit' from source: magic vars 22736 1727204269.03521: variable 'ansible_distribution_major_version' from source: facts 22736 1727204269.03696: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204269.03700: variable 'network_state' from source: role '' defaults 22736 1727204269.03725: Evaluated conditional (network_state != {}): False 22736 1727204269.03729: when evaluation is False, skipping this task 22736 1727204269.03732: _execute() done 22736 1727204269.03734: dumping result to json 22736 1727204269.03740: done dumping result, returning 22736 1727204269.03751: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-4f4a-548a-000000000050] 22736 1727204269.03756: sending task result for task 12b410aa-8751-4f4a-548a-000000000050 skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22736 1727204269.04080: no more pending results, returning what we have 22736 1727204269.04084: results queue empty 22736 1727204269.04085: checking for any_errors_fatal 22736 1727204269.04095: done checking 
for any_errors_fatal 22736 1727204269.04097: checking for max_fail_percentage 22736 1727204269.04099: done checking for max_fail_percentage 22736 1727204269.04100: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.04101: done checking to see if all hosts have failed 22736 1727204269.04101: getting the remaining hosts for this loop 22736 1727204269.04103: done getting the remaining hosts for this loop 22736 1727204269.04107: getting the next task for host managed-node2 22736 1727204269.04112: done getting next task for host managed-node2 22736 1727204269.04116: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204269.04118: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204269.04133: getting variables 22736 1727204269.04135: in VariableManager get_vars() 22736 1727204269.04211: Calling all_inventory to load vars for managed-node2 22736 1727204269.04215: Calling groups_inventory to load vars for managed-node2 22736 1727204269.04218: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.04259: done sending task result for task 12b410aa-8751-4f4a-548a-000000000050 22736 1727204269.04262: WORKER PROCESS EXITING 22736 1727204269.04275: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.04279: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.04283: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.07777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.11644: done with get_vars() 22736 1727204269.11693: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:49 -0400 (0:00:00.112) 0:00:33.903 ***** 22736 1727204269.11811: entering _queue_task() for managed-node2/ping 22736 1727204269.12303: worker is 1 (out of 1 available) 22736 1727204269.12315: exiting _queue_task() for managed-node2/ping 22736 1727204269.12445: done queuing things up, now waiting for results queue to drain 22736 1727204269.12448: waiting for pending results... 
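(Illustrative aside, not part of the captured output.) The "Evaluated conditional (...)" entries above — ansible_distribution_major_version != '6' → True, network_state != {} → False — are when-style expressions checked before each task either runs or is skipped. A minimal sketch, assuming only that such expressions are Jinja2 (the templating engine Ansible uses) and using invented fact values, of how that check can be reproduced:

    from jinja2 import Environment

    def evaluate_conditional(expression, variables):
        # Wrap the bare expression in an {% if %} block and render it to "True"/"False",
        # matching the "Evaluated conditional (...): True/False" lines in the log.
        template = Environment().from_string(
            "{%% if %s %%}True{%% else %%}False{%% endif %%}" % expression
        )
        return template.render(**variables) == "True"

    facts = {"ansible_distribution_major_version": "40", "network_state": {}}
    print(evaluate_conditional("ansible_distribution_major_version != '6'", facts))  # True
    print(evaluate_conditional("network_state != {}", facts))                        # False

This is only a model of the check; the real evaluation goes through Ansible's templar with the full set of host facts and magic variables, as the surrounding log shows.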
22736 1727204269.12679: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204269.12885: in run() - task 12b410aa-8751-4f4a-548a-000000000051 22736 1727204269.12891: variable 'ansible_search_path' from source: unknown 22736 1727204269.12895: variable 'ansible_search_path' from source: unknown 22736 1727204269.12923: calling self._execute() 22736 1727204269.13037: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204269.13052: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204269.13068: variable 'omit' from source: magic vars 22736 1727204269.13569: variable 'ansible_distribution_major_version' from source: facts 22736 1727204269.13587: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204269.13616: variable 'omit' from source: magic vars 22736 1727204269.13669: variable 'omit' from source: magic vars 22736 1727204269.13753: variable 'omit' from source: magic vars 22736 1727204269.13775: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204269.13829: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204269.13870: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204269.13947: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204269.13950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204269.13958: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204269.13971: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204269.13979: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204269.14120: Set connection var ansible_timeout to 10 22736 1727204269.14141: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204269.14155: Set connection var ansible_shell_executable to /bin/sh 22736 1727204269.14167: Set connection var ansible_shell_type to sh 22736 1727204269.14178: Set connection var ansible_pipelining to False 22736 1727204269.14295: Set connection var ansible_connection to ssh 22736 1727204269.14299: variable 'ansible_shell_executable' from source: unknown 22736 1727204269.14302: variable 'ansible_connection' from source: unknown 22736 1727204269.14304: variable 'ansible_module_compression' from source: unknown 22736 1727204269.14306: variable 'ansible_shell_type' from source: unknown 22736 1727204269.14309: variable 'ansible_shell_executable' from source: unknown 22736 1727204269.14311: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204269.14313: variable 'ansible_pipelining' from source: unknown 22736 1727204269.14315: variable 'ansible_timeout' from source: unknown 22736 1727204269.14318: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204269.14531: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204269.14572: variable 'omit' from source: magic vars 22736 
1727204269.14582: starting attempt loop 22736 1727204269.14594: running the handler 22736 1727204269.14618: _low_level_execute_command(): starting 22736 1727204269.14633: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204269.15499: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204269.15519: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.15574: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204269.15605: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204269.15628: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.15778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.17605: stdout chunk (state=3): >>>/root <<< 22736 1727204269.17819: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204269.18125: stderr chunk (state=3): >>><<< 22736 1727204269.18129: stdout chunk (state=3): >>><<< 22736 1727204269.18133: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204269.18135: _low_level_execute_command(): starting 22736 1727204269.18139: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954 `" && echo ansible-tmp-1727204269.180656-24523-242391926377954="` echo 
/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954 `" ) && sleep 0' 22736 1727204269.19386: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204269.19403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204269.19419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204269.19436: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204269.19495: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.19555: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204269.19666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204269.19675: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.19810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.21899: stdout chunk (state=3): >>>ansible-tmp-1727204269.180656-24523-242391926377954=/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954 <<< 22736 1727204269.22011: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204269.22219: stderr chunk (state=3): >>><<< 22736 1727204269.22223: stdout chunk (state=3): >>><<< 22736 1727204269.22280: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204269.180656-24523-242391926377954=/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204269.22308: variable 'ansible_module_compression' from source: unknown 22736 1727204269.22444: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22736 1727204269.22498: variable 'ansible_facts' from source: unknown 22736 1727204269.22995: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py 22736 1727204269.23056: Sending initial data 22736 1727204269.23060: Sent initial data (152 bytes) 22736 1727204269.24481: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204269.24505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204269.24587: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.24621: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204269.24702: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204269.24715: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.24792: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.26555: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204269.26579: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204269.26661: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpv1dbwirp /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py <<< 22736 1727204269.26665: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py" <<< 22736 1727204269.26721: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpv1dbwirp" to remote "/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py" <<< 22736 1727204269.26730: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py" <<< 22736 1727204269.28847: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204269.28906: stderr chunk (state=3): >>><<< 22736 1727204269.28910: stdout chunk (state=3): >>><<< 22736 1727204269.28985: done transferring module to remote 22736 1727204269.28988: _low_level_execute_command(): starting 22736 1727204269.28994: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/ /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py && sleep 0' 22736 1727204269.30399: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204269.30404: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.30407: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204269.30410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.30608: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.30699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.32828: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204269.32855: stderr chunk (state=3): >>><<< 22736 1727204269.32863: stdout chunk (state=3): >>><<< 22736 1727204269.32919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204269.32923: _low_level_execute_command(): starting 22736 1727204269.32927: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/AnsiballZ_ping.py && sleep 0' 22736 1727204269.34379: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204269.34384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.34387: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204269.34391: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204269.34393: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.34521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204269.34534: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204269.34638: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.34642: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.52123: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22736 1727204269.53901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204269.53906: stderr chunk (state=3): >>><<< 22736 1727204269.53912: stdout chunk (state=3): >>><<< 22736 1727204269.54002: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204269.54029: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204269.54041: _low_level_execute_command(): starting 22736 1727204269.54048: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204269.180656-24523-242391926377954/ > /dev/null 2>&1 && sleep 0' 22736 1727204269.55622: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204269.55724: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204269.55728: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK 
<<< 22736 1727204269.55848: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204269.55919: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204269.58097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204269.58123: stderr chunk (state=3): >>><<< 22736 1727204269.58126: stdout chunk (state=3): >>><<< 22736 1727204269.58210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204269.58219: handler run complete 22736 1727204269.58240: attempt loop complete, returning result 22736 1727204269.58243: _execute() done 22736 1727204269.58248: dumping result to json 22736 1727204269.58254: done dumping result, returning 22736 1727204269.58266: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-4f4a-548a-000000000051] 22736 1727204269.58269: sending task result for task 12b410aa-8751-4f4a-548a-000000000051 22736 1727204269.58521: done sending task result for task 12b410aa-8751-4f4a-548a-000000000051 22736 1727204269.58524: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 22736 1727204269.58594: no more pending results, returning what we have 22736 1727204269.58598: results queue empty 22736 1727204269.58599: checking for any_errors_fatal 22736 1727204269.58607: done checking for any_errors_fatal 22736 1727204269.58608: checking for max_fail_percentage 22736 1727204269.58609: done checking for max_fail_percentage 22736 1727204269.58610: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.58611: done checking to see if all hosts have failed 22736 1727204269.58612: getting the remaining hosts for this loop 22736 1727204269.58616: done getting the remaining hosts for this loop 22736 1727204269.58621: getting the next task for host managed-node2 22736 1727204269.58629: done getting next task for host managed-node2 22736 1727204269.58632: ^ task is: TASK: meta (role_complete) 22736 1727204269.58634: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204269.58646: getting variables 22736 1727204269.58648: in VariableManager get_vars() 22736 1727204269.58910: Calling all_inventory to load vars for managed-node2 22736 1727204269.58916: Calling groups_inventory to load vars for managed-node2 22736 1727204269.58920: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.58933: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.58937: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.58941: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.61959: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.65542: done with get_vars() 22736 1727204269.65590: done getting variables 22736 1727204269.65701: done queuing things up, now waiting for results queue to drain 22736 1727204269.65703: results queue empty 22736 1727204269.65704: checking for any_errors_fatal 22736 1727204269.65708: done checking for any_errors_fatal 22736 1727204269.65709: checking for max_fail_percentage 22736 1727204269.65711: done checking for max_fail_percentage 22736 1727204269.65712: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.65715: done checking to see if all hosts have failed 22736 1727204269.65716: getting the remaining hosts for this loop 22736 1727204269.65718: done getting the remaining hosts for this loop 22736 1727204269.65721: getting the next task for host managed-node2 22736 1727204269.65725: done getting next task for host managed-node2 22736 1727204269.65727: ^ task is: TASK: meta (flush_handlers) 22736 1727204269.65729: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204269.65733: getting variables 22736 1727204269.65734: in VariableManager get_vars() 22736 1727204269.65749: Calling all_inventory to load vars for managed-node2 22736 1727204269.65752: Calling groups_inventory to load vars for managed-node2 22736 1727204269.65755: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.65761: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.65769: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.65773: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.68177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.73243: done with get_vars() 22736 1727204269.73279: done getting variables 22736 1727204269.73470: in VariableManager get_vars() 22736 1727204269.73487: Calling all_inventory to load vars for managed-node2 22736 1727204269.73550: Calling groups_inventory to load vars for managed-node2 22736 1727204269.73554: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.73561: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.73565: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.73569: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.77446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.81202: done with get_vars() 22736 1727204269.81262: done queuing things up, now waiting for results queue to drain 22736 1727204269.81265: results queue empty 22736 1727204269.81266: checking for any_errors_fatal 22736 1727204269.81268: done checking for any_errors_fatal 22736 1727204269.81269: checking for max_fail_percentage 22736 1727204269.81271: done checking for max_fail_percentage 22736 1727204269.81272: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.81273: done checking to see if all hosts have failed 22736 1727204269.81274: getting the remaining hosts for this loop 22736 1727204269.81275: done getting the remaining hosts for this loop 22736 1727204269.81279: getting the next task for host managed-node2 22736 1727204269.81284: done getting next task for host managed-node2 22736 1727204269.81286: ^ task is: TASK: meta (flush_handlers) 22736 1727204269.81288: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204269.81294: getting variables 22736 1727204269.81295: in VariableManager get_vars() 22736 1727204269.81316: Calling all_inventory to load vars for managed-node2 22736 1727204269.81320: Calling groups_inventory to load vars for managed-node2 22736 1727204269.81323: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.81330: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.81335: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.81338: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.84180: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.88356: done with get_vars() 22736 1727204269.88406: done getting variables 22736 1727204269.88478: in VariableManager get_vars() 22736 1727204269.88497: Calling all_inventory to load vars for managed-node2 22736 1727204269.88500: Calling groups_inventory to load vars for managed-node2 22736 1727204269.88504: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.88517: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.88521: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.88525: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.90719: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204269.93923: done with get_vars() 22736 1727204269.93970: done queuing things up, now waiting for results queue to drain 22736 1727204269.93973: results queue empty 22736 1727204269.93974: checking for any_errors_fatal 22736 1727204269.93976: done checking for any_errors_fatal 22736 1727204269.93977: checking for max_fail_percentage 22736 1727204269.93978: done checking for max_fail_percentage 22736 1727204269.93979: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.93980: done checking to see if all hosts have failed 22736 1727204269.93981: getting the remaining hosts for this loop 22736 1727204269.93982: done getting the remaining hosts for this loop 22736 1727204269.93985: getting the next task for host managed-node2 22736 1727204269.93992: done getting next task for host managed-node2 22736 1727204269.93993: ^ task is: None 22736 1727204269.93995: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204269.93996: done queuing things up, now waiting for results queue to drain 22736 1727204269.93997: results queue empty 22736 1727204269.93998: checking for any_errors_fatal 22736 1727204269.93999: done checking for any_errors_fatal 22736 1727204269.94000: checking for max_fail_percentage 22736 1727204269.94002: done checking for max_fail_percentage 22736 1727204269.94002: checking to see if all hosts have failed and the running result is not ok 22736 1727204269.94003: done checking to see if all hosts have failed 22736 1727204269.94005: getting the next task for host managed-node2 22736 1727204269.94008: done getting next task for host managed-node2 22736 1727204269.94009: ^ task is: None 22736 1727204269.94010: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204269.94068: in VariableManager get_vars() 22736 1727204269.94094: done with get_vars() 22736 1727204269.94103: in VariableManager get_vars() 22736 1727204269.94117: done with get_vars() 22736 1727204269.94123: variable 'omit' from source: magic vars 22736 1727204269.94167: in VariableManager get_vars() 22736 1727204269.94181: done with get_vars() 22736 1727204269.94211: variable 'omit' from source: magic vars PLAY [Delete the interface] **************************************************** 22736 1727204269.94447: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204269.94577: getting the remaining hosts for this loop 22736 1727204269.94579: done getting the remaining hosts for this loop 22736 1727204269.94582: getting the next task for host managed-node2 22736 1727204269.94586: done getting next task for host managed-node2 22736 1727204269.94590: ^ task is: TASK: Gathering Facts 22736 1727204269.94592: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204269.94595: getting variables 22736 1727204269.94596: in VariableManager get_vars() 22736 1727204269.94606: Calling all_inventory to load vars for managed-node2 22736 1727204269.94609: Calling groups_inventory to load vars for managed-node2 22736 1727204269.94612: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204269.94620: Calling all_plugins_play to load vars for managed-node2 22736 1727204269.94625: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204269.94633: Calling groups_plugins_play to load vars for managed-node2 22736 1727204269.96777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204270.00060: done with get_vars() 22736 1727204270.00104: done getting variables 22736 1727204270.00173: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Tuesday 24 September 2024 14:57:50 -0400 (0:00:00.883) 0:00:34.786 ***** 22736 1727204270.00208: entering _queue_task() for managed-node2/gather_facts 22736 1727204270.00719: worker is 1 (out of 1 available) 22736 1727204270.00732: exiting _queue_task() for managed-node2/gather_facts 22736 1727204270.00745: done queuing things up, now waiting for results queue to drain 22736 1727204270.00747: waiting for pending results... 
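(Illustrative aside, not part of the captured output.) The ping task above and the Gathering Facts run that follows both go through the same low-level execution pattern visible in the _low_level_execute_command() lines: discover the remote home directory, create a private temp directory, upload the AnsiballZ payload over SFTP, chmod it, run it with the remote /usr/bin/python3.12 (which prints a JSON result such as {"ping": "pong"}), then remove the temp directory. A minimal local sketch of that shell sequence — using subprocess in place of the SSH connection plugin and a stand-in command in place of the real module payload, so the paths and commands here are demo values, not Ansible internals:

    import subprocess
    import time

    def low_level_execute(cmd):
        # Counterpart of the "executing: /bin/sh -c '...'" lines in the log.
        proc = subprocess.run(["/bin/sh", "-c", cmd], capture_output=True, text=True)
        return proc.returncode, proc.stdout, proc.stderr

    # 1. Discover the home directory of the (here: local) user.
    _, home, _ = low_level_execute("echo ~ && sleep 0")
    tmpdir = "%s/.ansible/tmp/ansible-tmp-%s-demo" % (home.strip(), time.time())

    # 2. Create a private temp directory (umask 77, as in the log).
    low_level_execute('( umask 77 && mkdir -p "%s" ) && sleep 0' % tmpdir)

    # 3. The real run uploads AnsiballZ_<module>.py over SFTP here, then marks it executable.
    low_level_execute('chmod u+x "%s" && sleep 0' % tmpdir)

    # 4. Stand-in for "python3.12 .../AnsiballZ_ping.py": emit a module-style JSON result.
    rc, out, _ = low_level_execute("""printf '{"ping": "pong"}\\n' && sleep 0""")

    # 5. Clean up the temp directory.
    low_level_execute('rm -f -r "%s" > /dev/null 2>&1 && sleep 0' % tmpdir)

    print(rc, out.strip())

In the actual run each of those one-liners travels over the persistent multiplexed SSH connection (hence the repeated "mux_client_request_session" stderr chunks), and the JSON the module prints becomes the task result rendered as ok: [managed-node2].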
22736 1727204270.01114: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204270.01131: in run() - task 12b410aa-8751-4f4a-548a-0000000003f8 22736 1727204270.01160: variable 'ansible_search_path' from source: unknown 22736 1727204270.01295: calling self._execute() 22736 1727204270.01354: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204270.01375: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204270.01394: variable 'omit' from source: magic vars 22736 1727204270.01919: variable 'ansible_distribution_major_version' from source: facts 22736 1727204270.01940: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204270.01953: variable 'omit' from source: magic vars 22736 1727204270.02001: variable 'omit' from source: magic vars 22736 1727204270.02049: variable 'omit' from source: magic vars 22736 1727204270.02106: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204270.02155: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204270.02200: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204270.02230: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204270.02294: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204270.02298: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204270.02318: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204270.02329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204270.02476: Set connection var ansible_timeout to 10 22736 1727204270.02501: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204270.02529: Set connection var ansible_shell_executable to /bin/sh 22736 1727204270.02541: Set connection var ansible_shell_type to sh 22736 1727204270.02633: Set connection var ansible_pipelining to False 22736 1727204270.02636: Set connection var ansible_connection to ssh 22736 1727204270.02639: variable 'ansible_shell_executable' from source: unknown 22736 1727204270.02641: variable 'ansible_connection' from source: unknown 22736 1727204270.02643: variable 'ansible_module_compression' from source: unknown 22736 1727204270.02645: variable 'ansible_shell_type' from source: unknown 22736 1727204270.02649: variable 'ansible_shell_executable' from source: unknown 22736 1727204270.02651: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204270.02653: variable 'ansible_pipelining' from source: unknown 22736 1727204270.02656: variable 'ansible_timeout' from source: unknown 22736 1727204270.02669: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204270.02937: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204270.02999: variable 'omit' from source: magic vars 22736 1727204270.03004: starting attempt loop 22736 1727204270.03007: running the 
handler 22736 1727204270.03018: variable 'ansible_facts' from source: unknown 22736 1727204270.03047: _low_level_execute_command(): starting 22736 1727204270.03098: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204270.04065: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204270.04094: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204270.04116: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.04236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.06052: stdout chunk (state=3): >>>/root <<< 22736 1727204270.06383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204270.06387: stdout chunk (state=3): >>><<< 22736 1727204270.06393: stderr chunk (state=3): >>><<< 22736 1727204270.06592: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204270.06599: _low_level_execute_command(): starting 22736 1727204270.06603: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475 `" && echo ansible-tmp-1727204270.0645204-24550-176380016459475="` echo /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475 `" ) && sleep 0' 22736 1727204270.07280: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 <<< 22736 1727204270.07302: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204270.07319: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204270.07338: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204270.07355: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204270.07455: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204270.07494: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.07563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.09682: stdout chunk (state=3): >>>ansible-tmp-1727204270.0645204-24550-176380016459475=/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475 <<< 22736 1727204270.10029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204270.10074: stderr chunk (state=3): >>><<< 22736 1727204270.10102: stdout chunk (state=3): >>><<< 22736 1727204270.10157: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204270.0645204-24550-176380016459475=/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204270.10206: variable 'ansible_module_compression' from source: unknown 22736 1727204270.10314: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204270.10459: variable 'ansible_facts' from source: unknown 22736 1727204270.10788: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py 22736 1727204270.11346: Sending initial data 22736 1727204270.11392: Sent initial data (154 bytes) 22736 1727204270.12342: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204270.12369: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.12373: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.12420: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204270.12443: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.12487: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.14254: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204270.14292: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204270.14329: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpesxvs0l_ /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py <<< 22736 1727204270.14332: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py" <<< 22736 1727204270.14364: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpesxvs0l_" to remote "/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py" <<< 22736 1727204270.16375: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204270.16554: stderr chunk (state=3): >>><<< 22736 1727204270.16557: stdout chunk (state=3): >>><<< 22736 1727204270.16560: done transferring module to remote 22736 1727204270.16562: _low_level_execute_command(): starting 22736 1727204270.16565: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/ /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py && sleep 0' 22736 1727204270.17085: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.17138: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204270.17160: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.17254: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.19232: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204270.19293: stderr chunk (state=3): >>><<< 22736 1727204270.19297: stdout chunk (state=3): >>><<< 22736 1727204270.19317: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204270.19324: _low_level_execute_command(): starting 22736 1727204270.19328: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/AnsiballZ_setup.py && sleep 0' 22736 1727204270.19936: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204270.19942: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204270.19945: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.19948: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.20007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204270.20015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.20069: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.90037: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "50", "epoch": "1727204270", "epoch_int": "1727204270", "date": "2024-09-24", "time": "14:57:50", "iso8601_micro": "2024-09-24T18:57:50.510565Z", "iso8601": "2024-09-24T18:57:50Z", "iso8601_basic": "20240924T145750510565", "iso8601_basic_short": "20240924T145750", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", 
"ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.8056640625, "5m": 0.6591796875, "15m": 0.4091796875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": 
"/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 870, "free": 2847}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 774, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, 
"ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146866688, "block_size": 4096, "block_total": 64479564, "block_available": 61315153, "block_used": 3164411, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7807:358f:2c9b:b2cc", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204270.92116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204270.92168: stderr chunk (state=3): >>><<< 22736 1727204270.92172: stdout chunk (state=3): >>><<< 22736 1727204270.92200: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_fips": false, "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_is_chroot": false, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "50", "epoch": "1727204270", "epoch_int": "1727204270", "date": "2024-09-24", "time": "14:57:50", "iso8601_micro": "2024-09-24T18:57:50.510565Z", "iso8601": "2024-09-24T18:57:50Z", "iso8601_basic": "20240924T145750510565", "iso8601_basic_short": "20240924T145750", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_loadavg": {"1m": 0.8056640625, "5m": 0.6591796875, "15m": 0.4091796875}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_fibre_channel_wwn": [], "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": 
"AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_iscsi_iqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_local": {}, "ansible_lsb": {}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2847, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 870, "free": 2847}, "nocache": {"free": 3478, "used": 239}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", 
"ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 774, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146866688, "block_size": 4096, "block_total": 64479564, "block_available": 61315153, "block_used": 3164411, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0", "peerlsr27", "lsr27"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lsr27": {"device": "lsr27", "macaddress": "1a:7c:88:07:95:bb", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::7807:358f:2c9b:b2cc", "prefix": "64", "scope": "link"}]}, "ansible_peerlsr27": {"device": "peerlsr27", "macaddress": "02:23:6d:58:c2:b5", "mtu": 1500, "active": true, "type": "ether", "speed": 10000, "promisc": false, "ipv6": [{"address": "fe80::23:6dff:fe58:c2b5", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": 
"eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc", "fe80::23:6dff:fe58:c2b5"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::23:6dff:fe58:c2b5", "fe80::4a44:1e77:128f:34e8", "fe80::7807:358f:2c9b:b2cc"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
22736 1727204270.92482: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204270.92502: _low_level_execute_command(): starting 22736 1727204270.92508: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204270.0645204-24550-176380016459475/ > /dev/null 2>&1 && sleep 0' 22736 1727204270.92953: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204270.92957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.92960: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204270.92962: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204270.93009: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204270.93030: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204270.93063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204270.95124: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204270.95128: stdout chunk (state=3): >>><<< 22736 1727204270.95130: stderr chunk (state=3): >>><<< 22736 1727204270.95133: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204270.95143: handler run complete 22736 1727204270.95369: variable 'ansible_facts' from source: unknown 22736 1727204270.95411: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204270.95868: variable 'ansible_facts' from source: unknown 22736 1727204270.96002: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204270.96204: attempt loop complete, returning result 22736 1727204270.96215: _execute() done 22736 1727204270.96224: dumping result to json 22736 1727204270.96267: done dumping result, returning 22736 1727204270.96280: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-0000000003f8] 22736 1727204270.96288: sending task result for task 12b410aa-8751-4f4a-548a-0000000003f8 ok: [managed-node2] 22736 1727204270.97172: no more pending results, returning what we have 22736 1727204270.97177: results queue empty 22736 1727204270.97178: checking for any_errors_fatal 22736 1727204270.97179: done checking for any_errors_fatal 22736 1727204270.97180: checking for max_fail_percentage 22736 1727204270.97181: done checking for max_fail_percentage 22736 1727204270.97182: checking to see if all hosts have failed and the running result is not ok 22736 1727204270.97183: done checking to see if all hosts have failed 22736 1727204270.97183: getting the remaining hosts for this loop 22736 1727204270.97184: done getting the remaining hosts for this loop 22736 1727204270.97187: getting the next task for host managed-node2 22736 1727204270.97193: done getting next task for host managed-node2 22736 1727204270.97194: ^ task is: TASK: meta (flush_handlers) 22736 1727204270.97196: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204270.97199: getting variables 22736 1727204270.97201: in VariableManager get_vars() 22736 1727204270.97222: Calling all_inventory to load vars for managed-node2 22736 1727204270.97224: Calling groups_inventory to load vars for managed-node2 22736 1727204270.97227: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204270.97237: Calling all_plugins_play to load vars for managed-node2 22736 1727204270.97240: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204270.97243: Calling groups_plugins_play to load vars for managed-node2 22736 1727204270.97761: done sending task result for task 12b410aa-8751-4f4a-548a-0000000003f8 22736 1727204270.97764: WORKER PROCESS EXITING 22736 1727204270.98521: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.00585: done with get_vars() 22736 1727204271.00610: done getting variables 22736 1727204271.00670: in VariableManager get_vars() 22736 1727204271.00678: Calling all_inventory to load vars for managed-node2 22736 1727204271.00680: Calling groups_inventory to load vars for managed-node2 22736 1727204271.00682: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.00686: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.00688: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.00692: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.01853: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.03436: done with get_vars() 22736 1727204271.03470: done queuing things up, now waiting for results queue to drain 22736 1727204271.03472: results queue empty 22736 1727204271.03473: checking for any_errors_fatal 22736 1727204271.03477: done checking for any_errors_fatal 22736 1727204271.03478: checking for max_fail_percentage 22736 1727204271.03478: done checking for max_fail_percentage 22736 1727204271.03479: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.03484: done checking to see if all hosts have failed 22736 1727204271.03485: getting the remaining hosts for this loop 22736 1727204271.03485: done getting the remaining hosts for this loop 22736 1727204271.03488: getting the next task for host managed-node2 22736 1727204271.03493: done getting next task for host managed-node2 22736 1727204271.03495: ^ task is: TASK: Include the task 'delete_interface.yml' 22736 1727204271.03497: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.03499: getting variables 22736 1727204271.03499: in VariableManager get_vars() 22736 1727204271.03507: Calling all_inventory to load vars for managed-node2 22736 1727204271.03509: Calling groups_inventory to load vars for managed-node2 22736 1727204271.03511: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.03516: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.03518: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.03521: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.04969: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.06802: done with get_vars() 22736 1727204271.06835: done getting variables TASK [Include the task 'delete_interface.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:8 Tuesday 24 September 2024 14:57:51 -0400 (0:00:01.067) 0:00:35.854 ***** 22736 1727204271.06940: entering _queue_task() for managed-node2/include_tasks 22736 1727204271.07251: worker is 1 (out of 1 available) 22736 1727204271.07265: exiting _queue_task() for managed-node2/include_tasks 22736 1727204271.07283: done queuing things up, now waiting for results queue to drain 22736 1727204271.07286: waiting for pending results... 22736 1727204271.07635: running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' 22736 1727204271.07757: in run() - task 12b410aa-8751-4f4a-548a-000000000054 22736 1727204271.07798: variable 'ansible_search_path' from source: unknown 22736 1727204271.07811: calling self._execute() 22736 1727204271.07893: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.07898: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.07910: variable 'omit' from source: magic vars 22736 1727204271.08348: variable 'ansible_distribution_major_version' from source: facts 22736 1727204271.08358: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204271.08369: _execute() done 22736 1727204271.08373: dumping result to json 22736 1727204271.08376: done dumping result, returning 22736 1727204271.08381: done running TaskExecutor() for managed-node2/TASK: Include the task 'delete_interface.yml' [12b410aa-8751-4f4a-548a-000000000054] 22736 1727204271.08387: sending task result for task 12b410aa-8751-4f4a-548a-000000000054 22736 1727204271.08664: done sending task result for task 12b410aa-8751-4f4a-548a-000000000054 22736 1727204271.08667: WORKER PROCESS EXITING 22736 1727204271.08694: no more pending results, returning what we have 22736 1727204271.08698: in VariableManager get_vars() 22736 1727204271.08761: Calling all_inventory to load vars for managed-node2 22736 1727204271.08765: Calling groups_inventory to load vars for managed-node2 22736 1727204271.08769: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.08781: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.08783: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.08786: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.10302: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.12325: done with get_vars() 22736 
1727204271.12360: variable 'ansible_search_path' from source: unknown 22736 1727204271.12377: we have included files to process 22736 1727204271.12378: generating all_blocks data 22736 1727204271.12380: done generating all_blocks data 22736 1727204271.12381: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22736 1727204271.12382: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22736 1727204271.12385: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml 22736 1727204271.12652: done processing included file 22736 1727204271.12655: iterating over new_blocks loaded from include file 22736 1727204271.12657: in VariableManager get_vars() 22736 1727204271.12672: done with get_vars() 22736 1727204271.12674: filtering new block on tags 22736 1727204271.12695: done filtering new block on tags 22736 1727204271.12698: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml for managed-node2 22736 1727204271.12704: extending task lists for all hosts with included blocks 22736 1727204271.12745: done extending task lists 22736 1727204271.12747: done processing included files 22736 1727204271.12748: results queue empty 22736 1727204271.12749: checking for any_errors_fatal 22736 1727204271.12751: done checking for any_errors_fatal 22736 1727204271.12752: checking for max_fail_percentage 22736 1727204271.12753: done checking for max_fail_percentage 22736 1727204271.12754: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.12755: done checking to see if all hosts have failed 22736 1727204271.12756: getting the remaining hosts for this loop 22736 1727204271.12757: done getting the remaining hosts for this loop 22736 1727204271.12760: getting the next task for host managed-node2 22736 1727204271.12765: done getting next task for host managed-node2 22736 1727204271.12767: ^ task is: TASK: Remove test interface if necessary 22736 1727204271.12770: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.12773: getting variables 22736 1727204271.12774: in VariableManager get_vars() 22736 1727204271.12784: Calling all_inventory to load vars for managed-node2 22736 1727204271.12787: Calling groups_inventory to load vars for managed-node2 22736 1727204271.12793: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.12799: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.12803: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.12807: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.14040: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.15594: done with get_vars() 22736 1727204271.15616: done getting variables 22736 1727204271.15652: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Remove test interface if necessary] ************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/delete_interface.yml:3 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.087) 0:00:35.941 ***** 22736 1727204271.15677: entering _queue_task() for managed-node2/command 22736 1727204271.16060: worker is 1 (out of 1 available) 22736 1727204271.16077: exiting _queue_task() for managed-node2/command 22736 1727204271.16096: done queuing things up, now waiting for results queue to drain 22736 1727204271.16098: waiting for pending results... 22736 1727204271.17252: running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary 22736 1727204271.17481: in run() - task 12b410aa-8751-4f4a-548a-000000000409 22736 1727204271.17578: variable 'ansible_search_path' from source: unknown 22736 1727204271.17673: variable 'ansible_search_path' from source: unknown 22736 1727204271.17677: calling self._execute() 22736 1727204271.17939: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.18003: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.18025: variable 'omit' from source: magic vars 22736 1727204271.18925: variable 'ansible_distribution_major_version' from source: facts 22736 1727204271.18930: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204271.18932: variable 'omit' from source: magic vars 22736 1727204271.19084: variable 'omit' from source: magic vars 22736 1727204271.19265: variable 'interface' from source: set_fact 22736 1727204271.19299: variable 'omit' from source: magic vars 22736 1727204271.19385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204271.19455: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204271.19470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204271.19505: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204271.19568: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 
1727204271.19578: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204271.19588: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.19606: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.19756: Set connection var ansible_timeout to 10 22736 1727204271.19785: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204271.19802: Set connection var ansible_shell_executable to /bin/sh 22736 1727204271.19806: Set connection var ansible_shell_type to sh 22736 1727204271.19815: Set connection var ansible_pipelining to False 22736 1727204271.19818: Set connection var ansible_connection to ssh 22736 1727204271.19839: variable 'ansible_shell_executable' from source: unknown 22736 1727204271.19843: variable 'ansible_connection' from source: unknown 22736 1727204271.19845: variable 'ansible_module_compression' from source: unknown 22736 1727204271.19848: variable 'ansible_shell_type' from source: unknown 22736 1727204271.19852: variable 'ansible_shell_executable' from source: unknown 22736 1727204271.19855: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.19860: variable 'ansible_pipelining' from source: unknown 22736 1727204271.19863: variable 'ansible_timeout' from source: unknown 22736 1727204271.19868: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.20002: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204271.20010: variable 'omit' from source: magic vars 22736 1727204271.20013: starting attempt loop 22736 1727204271.20021: running the handler 22736 1727204271.20039: _low_level_execute_command(): starting 22736 1727204271.20046: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204271.20600: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204271.20605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204271.20608: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.20659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204271.20663: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.20716: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.22515: stdout chunk 
(state=3): >>>/root <<< 22736 1727204271.22646: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204271.22693: stderr chunk (state=3): >>><<< 22736 1727204271.22697: stdout chunk (state=3): >>><<< 22736 1727204271.22717: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.22804: _low_level_execute_command(): starting 22736 1727204271.22809: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713 `" && echo ansible-tmp-1727204271.2272422-24589-244174028848713="` echo /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713 `" ) && sleep 0' 22736 1727204271.23412: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.23480: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204271.23501: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.23541: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.23615: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.25701: stdout chunk (state=3): >>>ansible-tmp-1727204271.2272422-24589-244174028848713=/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713 <<< 22736 1727204271.25858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 
1727204271.25883: stderr chunk (state=3): >>><<< 22736 1727204271.25887: stdout chunk (state=3): >>><<< 22736 1727204271.25906: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204271.2272422-24589-244174028848713=/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.25944: variable 'ansible_module_compression' from source: unknown 22736 1727204271.26022: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204271.26067: variable 'ansible_facts' from source: unknown 22736 1727204271.26136: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py 22736 1727204271.26274: Sending initial data 22736 1727204271.26278: Sent initial data (156 bytes) 22736 1727204271.26961: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204271.26986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.27006: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.27093: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.28842: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports 
extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204271.28883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204271.28932: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpbaphdnp2 /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py <<< 22736 1727204271.28936: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py" <<< 22736 1727204271.28982: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpbaphdnp2" to remote "/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py" <<< 22736 1727204271.31572: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204271.31634: stderr chunk (state=3): >>><<< 22736 1727204271.31645: stdout chunk (state=3): >>><<< 22736 1727204271.31696: done transferring module to remote 22736 1727204271.31793: _low_level_execute_command(): starting 22736 1727204271.31797: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/ /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py && sleep 0' 22736 1727204271.32850: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204271.32879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.32996: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204271.33020: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.33117: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.33233: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.35409: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 22736 1727204271.35412: stdout chunk (state=3): >>><<< 22736 1727204271.35418: stderr chunk (state=3): >>><<< 22736 1727204271.35468: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.35583: _low_level_execute_command(): starting 22736 1727204271.35587: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/AnsiballZ_command.py && sleep 0' 22736 1727204271.36206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.36234: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204271.36253: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.36278: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.36361: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.55219: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:51.535752", "end": "2024-09-24 14:57:51.547684", "delta": "0:00:00.011932", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 
1727204271.57357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204271.57361: stdout chunk (state=3): >>><<< 22736 1727204271.57364: stderr chunk (state=3): >>><<< 22736 1727204271.57559: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 0, "cmd": ["ip", "link", "del", "lsr27"], "start": "2024-09-24 14:57:51.535752", "end": "2024-09-24 14:57:51.547684", "delta": "0:00:00.011932", "msg": "", "invocation": {"module_args": {"_raw_params": "ip link del lsr27", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
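[editor's note] The JSON above is the ansible.legacy.command result for the "Remove test interface if necessary" task that the executor reports a few entries further down. The test playbook itself is not reproduced in this log, so the following is only a hedged sketch of the kind of cleanup task that would produce this invocation; the register name and the use of changed_when are assumptions inferred from the log, not taken from the source playbook.

    # Hypothetical reconstruction -- the actual test task is not included in this log.
    # A cleanup task of roughly this shape would yield the "ip link del lsr27" invocation
    # shown above (_raw_params: "ip link del lsr27", _uses_shell: false, i.e. the command
    # module rather than shell).
    - name: Remove test interface if necessary
      ansible.builtin.command: ip link del lsr27
      register: remove_link     # hypothetical variable name, for illustration only
      changed_when: false       # assumption, inferred from the final task result reporting
                                # changed: false even though the module returned "changed": true

With a free-form command like this, Ansible passes the string through as _raw_params and leaves _uses_shell false, which matches the module_args recorded in the invocation block above; the "Evaluated conditional (False): False" entry further down is consistent with a literal false changed_when/failed_when expression being evaluated after the handler runs.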
22736 1727204271.57564: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204271.57567: _low_level_execute_command(): starting 22736 1727204271.57570: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204271.2272422-24589-244174028848713/ > /dev/null 2>&1 && sleep 0' 22736 1727204271.58206: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204271.58235: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204271.58258: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.58318: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.58336: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.58372: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.60317: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204271.60371: stderr chunk (state=3): >>><<< 22736 1727204271.60374: stdout chunk (state=3): >>><<< 22736 1727204271.60394: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.60404: handler run complete 22736 1727204271.60429: Evaluated conditional (False): False 22736 1727204271.60439: attempt loop complete, returning result 22736 1727204271.60442: _execute() done 22736 1727204271.60447: dumping result to json 22736 1727204271.60453: done dumping result, returning 22736 1727204271.60462: done running TaskExecutor() for managed-node2/TASK: Remove test interface if necessary [12b410aa-8751-4f4a-548a-000000000409] 22736 1727204271.60467: sending task result for task 12b410aa-8751-4f4a-548a-000000000409 22736 1727204271.60575: done sending task result for task 12b410aa-8751-4f4a-548a-000000000409 22736 1727204271.60579: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": [ "ip", "link", "del", "lsr27" ], "delta": "0:00:00.011932", "end": "2024-09-24 14:57:51.547684", "rc": 0, "start": "2024-09-24 14:57:51.535752" } 22736 1727204271.60660: no more pending results, returning what we have 22736 1727204271.60665: results queue empty 22736 1727204271.60666: checking for any_errors_fatal 22736 1727204271.60668: done checking for any_errors_fatal 22736 1727204271.60669: checking for max_fail_percentage 22736 1727204271.60670: done checking for max_fail_percentage 22736 1727204271.60671: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.60672: done checking to see if all hosts have failed 22736 1727204271.60673: getting the remaining hosts for this loop 22736 1727204271.60675: done getting the remaining hosts for this loop 22736 1727204271.60680: getting the next task for host managed-node2 22736 1727204271.60688: done getting next task for host managed-node2 22736 1727204271.60693: ^ task is: TASK: meta (flush_handlers) 22736 1727204271.60696: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.60701: getting variables 22736 1727204271.60703: in VariableManager get_vars() 22736 1727204271.60738: Calling all_inventory to load vars for managed-node2 22736 1727204271.60741: Calling groups_inventory to load vars for managed-node2 22736 1727204271.60745: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.60758: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.60763: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.60766: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.62606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.64188: done with get_vars() 22736 1727204271.64215: done getting variables 22736 1727204271.64275: in VariableManager get_vars() 22736 1727204271.64285: Calling all_inventory to load vars for managed-node2 22736 1727204271.64287: Calling groups_inventory to load vars for managed-node2 22736 1727204271.64291: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.64296: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.64298: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.64300: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.65746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.68629: done with get_vars() 22736 1727204271.68677: done queuing things up, now waiting for results queue to drain 22736 1727204271.68680: results queue empty 22736 1727204271.68681: checking for any_errors_fatal 22736 1727204271.68686: done checking for any_errors_fatal 22736 1727204271.68687: checking for max_fail_percentage 22736 1727204271.68690: done checking for max_fail_percentage 22736 1727204271.68691: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.68692: done checking to see if all hosts have failed 22736 1727204271.68693: getting the remaining hosts for this loop 22736 1727204271.68695: done getting the remaining hosts for this loop 22736 1727204271.68698: getting the next task for host managed-node2 22736 1727204271.68703: done getting next task for host managed-node2 22736 1727204271.68705: ^ task is: TASK: meta (flush_handlers) 22736 1727204271.68707: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.68711: getting variables 22736 1727204271.68712: in VariableManager get_vars() 22736 1727204271.68723: Calling all_inventory to load vars for managed-node2 22736 1727204271.68726: Calling groups_inventory to load vars for managed-node2 22736 1727204271.68729: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.68738: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.68741: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.68745: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.70796: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.73667: done with get_vars() 22736 1727204271.73710: done getting variables 22736 1727204271.73776: in VariableManager get_vars() 22736 1727204271.73791: Calling all_inventory to load vars for managed-node2 22736 1727204271.73794: Calling groups_inventory to load vars for managed-node2 22736 1727204271.73797: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.73804: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.73807: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.73811: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.75762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.78647: done with get_vars() 22736 1727204271.78695: done queuing things up, now waiting for results queue to drain 22736 1727204271.78698: results queue empty 22736 1727204271.78699: checking for any_errors_fatal 22736 1727204271.78701: done checking for any_errors_fatal 22736 1727204271.78701: checking for max_fail_percentage 22736 1727204271.78703: done checking for max_fail_percentage 22736 1727204271.78703: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.78704: done checking to see if all hosts have failed 22736 1727204271.78705: getting the remaining hosts for this loop 22736 1727204271.78706: done getting the remaining hosts for this loop 22736 1727204271.78709: getting the next task for host managed-node2 22736 1727204271.78713: done getting next task for host managed-node2 22736 1727204271.78714: ^ task is: None 22736 1727204271.78716: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.78717: done queuing things up, now waiting for results queue to drain 22736 1727204271.78718: results queue empty 22736 1727204271.78719: checking for any_errors_fatal 22736 1727204271.78720: done checking for any_errors_fatal 22736 1727204271.78721: checking for max_fail_percentage 22736 1727204271.78722: done checking for max_fail_percentage 22736 1727204271.78722: checking to see if all hosts have failed and the running result is not ok 22736 1727204271.78723: done checking to see if all hosts have failed 22736 1727204271.78725: getting the next task for host managed-node2 22736 1727204271.78735: done getting next task for host managed-node2 22736 1727204271.78736: ^ task is: None 22736 1727204271.78737: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204271.78784: in VariableManager get_vars() 22736 1727204271.78814: done with get_vars() 22736 1727204271.78821: in VariableManager get_vars() 22736 1727204271.78839: done with get_vars() 22736 1727204271.78845: variable 'omit' from source: magic vars 22736 1727204271.78985: variable 'profile' from source: play vars 22736 1727204271.79113: in VariableManager get_vars() 22736 1727204271.79131: done with get_vars() 22736 1727204271.79157: variable 'omit' from source: magic vars 22736 1727204271.79239: variable 'profile' from source: play vars PLAY [Remove {{ profile }}] **************************************************** 22736 1727204271.80216: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204271.80243: getting the remaining hosts for this loop 22736 1727204271.80244: done getting the remaining hosts for this loop 22736 1727204271.80248: getting the next task for host managed-node2 22736 1727204271.80251: done getting next task for host managed-node2 22736 1727204271.80254: ^ task is: TASK: Gathering Facts 22736 1727204271.80255: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204271.80258: getting variables 22736 1727204271.80259: in VariableManager get_vars() 22736 1727204271.80273: Calling all_inventory to load vars for managed-node2 22736 1727204271.80276: Calling groups_inventory to load vars for managed-node2 22736 1727204271.80279: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204271.80286: Calling all_plugins_play to load vars for managed-node2 22736 1727204271.80292: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204271.80297: Calling groups_plugins_play to load vars for managed-node2 22736 1727204271.86309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204271.87861: done with get_vars() 22736 1727204271.87883: done getting variables 22736 1727204271.87923: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 Tuesday 24 September 2024 14:57:51 -0400 (0:00:00.722) 0:00:36.664 ***** 22736 1727204271.87945: entering _queue_task() for managed-node2/gather_facts 22736 1727204271.88234: worker is 1 (out of 1 available) 22736 1727204271.88254: exiting _queue_task() for managed-node2/gather_facts 22736 1727204271.88266: done queuing things up, now waiting for results queue to drain 22736 1727204271.88268: waiting for pending results... 
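[editor's note] The PLAY and TASK banners above mark the start of the next play, "Remove {{ profile }}", whose implicit fact-gathering step (remove_profile.yml:3) drives the AnsiballZ_setup.py transfer and execution that follow. remove_profile.yml is only referenced by path in this log, so the play header below is a hedged sketch consistent with those banners; the hosts pattern is an assumption, and the play's variables (including the value of "profile") and its tasks are not reproduced here.

    # Hypothetical sketch only -- remove_profile.yml itself is not part of this log.
    # A play header of this shape matches the "PLAY [Remove {{ profile }}]" banner and the
    # implicit "Gathering Facts" task that runs AnsiballZ_setup.py below.
    - name: Remove {{ profile }}
      hosts: all              # assumption; the log only shows this play running on managed-node2
      gather_facts: true      # matches the implicit setup task at remove_profile.yml:3
      # The play's own vars (e.g. "profile", shown as a play var in the log) and its tasks
      # are outside this excerpt and are deliberately not guessed at here.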
22736 1727204271.88570: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204271.88743: in run() - task 12b410aa-8751-4f4a-548a-000000000417 22736 1727204271.88746: variable 'ansible_search_path' from source: unknown 22736 1727204271.88995: calling self._execute() 22736 1727204271.88999: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.89002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.89005: variable 'omit' from source: magic vars 22736 1727204271.89399: variable 'ansible_distribution_major_version' from source: facts 22736 1727204271.89418: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204271.89444: variable 'omit' from source: magic vars 22736 1727204271.89470: variable 'omit' from source: magic vars 22736 1727204271.89560: variable 'omit' from source: magic vars 22736 1727204271.89569: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204271.89616: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204271.89647: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204271.89684: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204271.89706: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204271.89749: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204271.89759: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.89778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.89996: Set connection var ansible_timeout to 10 22736 1727204271.90002: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204271.90005: Set connection var ansible_shell_executable to /bin/sh 22736 1727204271.90007: Set connection var ansible_shell_type to sh 22736 1727204271.90009: Set connection var ansible_pipelining to False 22736 1727204271.90012: Set connection var ansible_connection to ssh 22736 1727204271.90014: variable 'ansible_shell_executable' from source: unknown 22736 1727204271.90016: variable 'ansible_connection' from source: unknown 22736 1727204271.90020: variable 'ansible_module_compression' from source: unknown 22736 1727204271.90030: variable 'ansible_shell_type' from source: unknown 22736 1727204271.90040: variable 'ansible_shell_executable' from source: unknown 22736 1727204271.90048: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204271.90096: variable 'ansible_pipelining' from source: unknown 22736 1727204271.90099: variable 'ansible_timeout' from source: unknown 22736 1727204271.90101: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204271.90295: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204271.90314: variable 'omit' from source: magic vars 22736 1727204271.90325: starting attempt loop 22736 1727204271.90334: running the 
handler 22736 1727204271.90364: variable 'ansible_facts' from source: unknown 22736 1727204271.90426: _low_level_execute_command(): starting 22736 1727204271.90429: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204271.91057: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204271.91081: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204271.91110: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.91153: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.91156: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.91236: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.93113: stdout chunk (state=3): >>>/root <<< 22736 1727204271.93262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204271.93266: stdout chunk (state=3): >>><<< 22736 1727204271.93269: stderr chunk (state=3): >>><<< 22736 1727204271.93418: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.93423: _low_level_execute_command(): starting 22736 1727204271.93426: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175 `" && echo ansible-tmp-1727204271.933092-24611-102694757153175="` echo 
/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175 `" ) && sleep 0' 22736 1727204271.94082: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204271.94136: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.94157: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204271.94250: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.94304: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.94321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.94481: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.96491: stdout chunk (state=3): >>>ansible-tmp-1727204271.933092-24611-102694757153175=/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175 <<< 22736 1727204271.96610: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204271.96658: stderr chunk (state=3): >>><<< 22736 1727204271.96662: stdout chunk (state=3): >>><<< 22736 1727204271.96680: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204271.933092-24611-102694757153175=/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204271.96713: variable 'ansible_module_compression' from source: unknown 22736 1727204271.96766: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204271.96818: 
variable 'ansible_facts' from source: unknown 22736 1727204271.96929: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py 22736 1727204271.97051: Sending initial data 22736 1727204271.97054: Sent initial data (153 bytes) 22736 1727204271.97517: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204271.97521: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.97526: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204271.97529: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204271.97531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204271.97582: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204271.97588: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204271.97626: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204271.99266: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204271.99325: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204271.99356: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpecfvvm6x /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py <<< 22736 1727204271.99360: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py" <<< 22736 1727204271.99422: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpecfvvm6x" to remote "/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py" <<< 22736 1727204272.01430: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204272.01503: stderr chunk (state=3): >>><<< 22736 1727204272.01506: stdout chunk (state=3): >>><<< 22736 1727204272.01539: done transferring module to remote 22736 1727204272.01591: _low_level_execute_command(): starting 22736 1727204272.01597: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/ /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py && sleep 0' 22736 1727204272.02354: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204272.02393: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204272.02413: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204272.02431: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204272.04265: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204272.04324: stderr chunk (state=3): >>><<< 22736 1727204272.04329: stdout chunk (state=3): >>><<< 22736 1727204272.04350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204272.04359: _low_level_execute_command(): starting 22736 1727204272.04362: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/AnsiballZ_setup.py && sleep 0' 22736 1727204272.04912: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204272.04916: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204272.04922: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204272.04924: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204272.04927: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204272.04984: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204272.04986: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204272.05026: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204272.72864: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.8212890625, "5m": 0.6650390625, "15m": 0.41259765625}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL<<< 22736 1727204272.72901: stdout chunk (state=3): >>>_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", 
"ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "52", "epoch": "1727204272", "epoch_int": "1727204272", "date": "2024-09-24", "time": "14:57:52", "iso8601_micro": "2024-09-24T18:57:52.363569Z", "iso8601": "2024-09-24T18:57:52Z", "iso8601_basic": "20240924T145752363569", "iso8601_basic_short": "20240924T145752", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": 
"mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 776, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146850304, "block_size": 4096, "block_total": 64479564, "block_available": 61315149, "block_used": 3164415, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204272.75010: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204272.75015: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204272.75296: stderr chunk (state=3): >>><<< 22736 1727204272.75300: stdout chunk (state=3): >>><<< 22736 1727204272.75303: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_loadavg": {"1m": 0.8212890625, "5m": 0.6650390625, "15m": 0.41259765625}, "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_pkg_mgr": "dnf", "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": 
"ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_hostnqn": "", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "57", "second": "52", "epoch": "1727204272", "epoch_int": "1727204272", "date": "2024-09-24", "time": "14:57:52", "iso8601_micro": "2024-09-24T18:57:52.363569Z", "iso8601": "2024-09-24T18:57:52Z", "iso8601_basic": "20240924T145752363569", "iso8601_basic_short": "20240924T145752", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_local": {}, "ansible_fibre_channel_wwn": [], "ansible_lsb": {}, "ansible_is_chroot": false, "ansible_apparmor": {"status": "disabled"}, "ansible_iscsi_iqn": "", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", 
"ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 776, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146850304, "block_size": 4096, "block_total": 64479564, "block_available": 61315149, "block_used": 3164415, "inode_total": 16384000, "inode_available": 16302248, "inode_used": 81752, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_interfaces": ["eth0", "lo"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204272.75671: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204272.75693: _low_level_execute_command(): starting 22736 1727204272.75704: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204271.933092-24611-102694757153175/ > /dev/null 2>&1 && sleep 0' 22736 1727204272.76431: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204272.76447: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204272.76463: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204272.76538: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204272.76606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204272.76639: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204272.76665: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204272.76749: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 22736 1727204272.78995: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204272.78999: stdout chunk (state=3): >>><<< 22736 1727204272.79002: stderr chunk (state=3): >>><<< 22736 1727204272.79005: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204272.79008: handler run complete 22736 1727204272.79086: variable 'ansible_facts' from source: unknown 22736 1727204272.79259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204272.79790: variable 'ansible_facts' from source: unknown 22736 1727204272.79925: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204272.80194: attempt loop complete, returning result 22736 1727204272.80197: _execute() done 22736 1727204272.80200: dumping result to json 22736 1727204272.80202: done dumping result, returning 22736 1727204272.80204: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-000000000417] 22736 1727204272.80217: sending task result for task 12b410aa-8751-4f4a-548a-000000000417 ok: [managed-node2] 22736 1727204272.81425: done sending task result for task 12b410aa-8751-4f4a-548a-000000000417 22736 1727204272.81430: WORKER PROCESS EXITING 22736 1727204272.81467: no more pending results, returning what we have 22736 1727204272.81471: results queue empty 22736 1727204272.81472: checking for any_errors_fatal 22736 1727204272.81474: done checking for any_errors_fatal 22736 1727204272.81475: checking for max_fail_percentage 22736 1727204272.81476: done checking for max_fail_percentage 22736 1727204272.81478: checking to see if all hosts have failed and the running result is not ok 22736 1727204272.81479: done checking to see if all hosts have failed 22736 1727204272.81480: getting the remaining hosts for this loop 22736 1727204272.81481: done getting the remaining hosts for this loop 22736 1727204272.81485: getting the next task for host managed-node2 22736 1727204272.81494: done getting next task for host managed-node2 22736 1727204272.81496: ^ task is: TASK: meta (flush_handlers) 22736 1727204272.81499: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, 
pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204272.81503: getting variables 22736 1727204272.81505: in VariableManager get_vars() 22736 1727204272.81547: Calling all_inventory to load vars for managed-node2 22736 1727204272.81552: Calling groups_inventory to load vars for managed-node2 22736 1727204272.81555: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204272.81572: Calling all_plugins_play to load vars for managed-node2 22736 1727204272.81576: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204272.81580: Calling groups_plugins_play to load vars for managed-node2 22736 1727204272.84068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204272.86156: done with get_vars() 22736 1727204272.86181: done getting variables 22736 1727204272.86248: in VariableManager get_vars() 22736 1727204272.86259: Calling all_inventory to load vars for managed-node2 22736 1727204272.86261: Calling groups_inventory to load vars for managed-node2 22736 1727204272.86263: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204272.86267: Calling all_plugins_play to load vars for managed-node2 22736 1727204272.86269: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204272.86271: Calling groups_plugins_play to load vars for managed-node2 22736 1727204272.87396: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204272.90284: done with get_vars() 22736 1727204272.90316: done queuing things up, now waiting for results queue to drain 22736 1727204272.90320: results queue empty 22736 1727204272.90321: checking for any_errors_fatal 22736 1727204272.90326: done checking for any_errors_fatal 22736 1727204272.90327: checking for max_fail_percentage 22736 1727204272.90329: done checking for max_fail_percentage 22736 1727204272.90329: checking to see if all hosts have failed and the running result is not ok 22736 1727204272.90334: done checking to see if all hosts have failed 22736 1727204272.90334: getting the remaining hosts for this loop 22736 1727204272.90335: done getting the remaining hosts for this loop 22736 1727204272.90338: getting the next task for host managed-node2 22736 1727204272.90341: done getting next task for host managed-node2 22736 1727204272.90344: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204272.90345: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204272.90355: getting variables 22736 1727204272.90356: in VariableManager get_vars() 22736 1727204272.90368: Calling all_inventory to load vars for managed-node2 22736 1727204272.90370: Calling groups_inventory to load vars for managed-node2 22736 1727204272.90372: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204272.90376: Calling all_plugins_play to load vars for managed-node2 22736 1727204272.90378: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204272.90380: Calling groups_plugins_play to load vars for managed-node2 22736 1727204272.91496: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204272.93065: done with get_vars() 22736 1727204272.93087: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Tuesday 24 September 2024 14:57:52 -0400 (0:00:01.052) 0:00:37.716 ***** 22736 1727204272.93173: entering _queue_task() for managed-node2/include_tasks 22736 1727204272.93569: worker is 1 (out of 1 available) 22736 1727204272.93583: exiting _queue_task() for managed-node2/include_tasks 22736 1727204272.93799: done queuing things up, now waiting for results queue to drain 22736 1727204272.93801: waiting for pending results... 22736 1727204272.93927: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 22736 1727204272.94072: in run() - task 12b410aa-8751-4f4a-548a-00000000005c 22736 1727204272.94099: variable 'ansible_search_path' from source: unknown 22736 1727204272.94113: variable 'ansible_search_path' from source: unknown 22736 1727204272.94164: calling self._execute() 22736 1727204272.94279: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204272.94298: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204272.94321: variable 'omit' from source: magic vars 22736 1727204272.94694: variable 'ansible_distribution_major_version' from source: facts 22736 1727204272.94709: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204272.94713: _execute() done 22736 1727204272.94719: dumping result to json 22736 1727204272.94723: done dumping result, returning 22736 1727204272.94727: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12b410aa-8751-4f4a-548a-00000000005c] 22736 1727204272.94731: sending task result for task 12b410aa-8751-4f4a-548a-00000000005c 22736 1727204272.94896: no more pending results, returning what we have 22736 1727204272.94901: in VariableManager get_vars() 22736 1727204272.94963: Calling all_inventory to load vars for managed-node2 22736 1727204272.94967: Calling groups_inventory to load vars for managed-node2 22736 1727204272.94969: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204272.94976: done sending task result for task 12b410aa-8751-4f4a-548a-00000000005c 22736 1727204272.94979: WORKER PROCESS EXITING 22736 1727204272.94990: Calling all_plugins_play to load vars for managed-node2 22736 1727204272.94994: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204272.94998: Calling groups_plugins_play to load vars for managed-node2 22736 1727204272.96782: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204273.01296: done with get_vars() 22736 1727204273.01335: variable 'ansible_search_path' from source: unknown 22736 1727204273.01337: variable 'ansible_search_path' from source: unknown 22736 1727204273.01384: we have included files to process 22736 1727204273.01386: generating all_blocks data 22736 1727204273.01388: done generating all_blocks data 22736 1727204273.01391: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204273.01393: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204273.01396: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 22736 1727204273.02072: done processing included file 22736 1727204273.02075: iterating over new_blocks loaded from include file 22736 1727204273.02077: in VariableManager get_vars() 22736 1727204273.02112: done with get_vars() 22736 1727204273.02113: filtering new block on tags 22736 1727204273.02131: done filtering new block on tags 22736 1727204273.02133: in VariableManager get_vars() 22736 1727204273.02152: done with get_vars() 22736 1727204273.02154: filtering new block on tags 22736 1727204273.02178: done filtering new block on tags 22736 1727204273.02180: in VariableManager get_vars() 22736 1727204273.02200: done with get_vars() 22736 1727204273.02202: filtering new block on tags 22736 1727204273.02216: done filtering new block on tags 22736 1727204273.02219: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed-node2 22736 1727204273.02223: extending task lists for all hosts with included blocks 22736 1727204273.02655: done extending task lists 22736 1727204273.02657: done processing included files 22736 1727204273.02657: results queue empty 22736 1727204273.02658: checking for any_errors_fatal 22736 1727204273.02659: done checking for any_errors_fatal 22736 1727204273.02660: checking for max_fail_percentage 22736 1727204273.02661: done checking for max_fail_percentage 22736 1727204273.02661: checking to see if all hosts have failed and the running result is not ok 22736 1727204273.02662: done checking to see if all hosts have failed 22736 1727204273.02663: getting the remaining hosts for this loop 22736 1727204273.02664: done getting the remaining hosts for this loop 22736 1727204273.02665: getting the next task for host managed-node2 22736 1727204273.02668: done getting next task for host managed-node2 22736 1727204273.02671: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204273.02673: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204273.02680: getting variables 22736 1727204273.02681: in VariableManager get_vars() 22736 1727204273.02700: Calling all_inventory to load vars for managed-node2 22736 1727204273.02707: Calling groups_inventory to load vars for managed-node2 22736 1727204273.02710: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204273.02717: Calling all_plugins_play to load vars for managed-node2 22736 1727204273.02721: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204273.02725: Calling groups_plugins_play to load vars for managed-node2 22736 1727204273.04081: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204273.05844: done with get_vars() 22736 1727204273.05867: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.127) 0:00:37.844 ***** 22736 1727204273.05942: entering _queue_task() for managed-node2/setup 22736 1727204273.06233: worker is 1 (out of 1 available) 22736 1727204273.06248: exiting _queue_task() for managed-node2/setup 22736 1727204273.06262: done queuing things up, now waiting for results queue to drain 22736 1727204273.06264: waiting for pending results... 22736 1727204273.06456: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 22736 1727204273.06560: in run() - task 12b410aa-8751-4f4a-548a-000000000458 22736 1727204273.06573: variable 'ansible_search_path' from source: unknown 22736 1727204273.06577: variable 'ansible_search_path' from source: unknown 22736 1727204273.06621: calling self._execute() 22736 1727204273.06688: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.06698: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.06709: variable 'omit' from source: magic vars 22736 1727204273.07038: variable 'ansible_distribution_major_version' from source: facts 22736 1727204273.07053: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204273.07240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204273.08969: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204273.09030: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204273.09064: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204273.09095: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204273.09168: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204273.09225: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204273.09282: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 22736 1727204273.09295: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204273.09328: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204273.09361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204273.09466: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204273.09478: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204273.09515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204273.09573: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204273.09586: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204273.09787: variable '__network_required_facts' from source: role '' defaults 22736 1727204273.09798: variable 'ansible_facts' from source: unknown 22736 1727204273.10735: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 22736 1727204273.10739: when evaluation is False, skipping this task 22736 1727204273.10747: _execute() done 22736 1727204273.10751: dumping result to json 22736 1727204273.10754: done dumping result, returning 22736 1727204273.10757: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12b410aa-8751-4f4a-548a-000000000458] 22736 1727204273.10759: sending task result for task 12b410aa-8751-4f4a-548a-000000000458 22736 1727204273.10898: done sending task result for task 12b410aa-8751-4f4a-548a-000000000458 22736 1727204273.10901: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204273.11002: no more pending results, returning what we have 22736 1727204273.11006: results queue empty 22736 1727204273.11007: checking for any_errors_fatal 22736 1727204273.11009: done checking for any_errors_fatal 22736 1727204273.11011: checking for max_fail_percentage 22736 1727204273.11013: done checking for max_fail_percentage 22736 1727204273.11014: checking to see if all hosts have failed and the running result is not ok 22736 1727204273.11015: done checking to see if all hosts have failed 22736 1727204273.11016: getting the remaining hosts for this loop 22736 1727204273.11019: done getting the remaining hosts for 
this loop 22736 1727204273.11025: getting the next task for host managed-node2 22736 1727204273.11036: done getting next task for host managed-node2 22736 1727204273.11040: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204273.11043: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204273.11057: getting variables 22736 1727204273.11058: in VariableManager get_vars() 22736 1727204273.11101: Calling all_inventory to load vars for managed-node2 22736 1727204273.11104: Calling groups_inventory to load vars for managed-node2 22736 1727204273.11106: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204273.11117: Calling all_plugins_play to load vars for managed-node2 22736 1727204273.11122: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204273.11125: Calling groups_plugins_play to load vars for managed-node2 22736 1727204273.12683: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204273.14377: done with get_vars() 22736 1727204273.14405: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.085) 0:00:37.929 ***** 22736 1727204273.14488: entering _queue_task() for managed-node2/stat 22736 1727204273.14780: worker is 1 (out of 1 available) 22736 1727204273.14798: exiting _queue_task() for managed-node2/stat 22736 1727204273.14811: done queuing things up, now waiting for results queue to drain 22736 1727204273.14813: waiting for pending results... 
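The debug entries that follow trace the task at roles/network/tasks/set_facts.yml:12. Based only on the action queued above (stat) and the conditional evaluated below, the task plausibly resembles this minimal sketch; the probed path and the register variable name are assumptions, not taken from the log:

- name: Check if system is ostree
  stat:
    path: /run/ostree-booted        # assumed: conventional ostree marker file, not confirmed by the log
  register: __ostree_booted_stat    # assumed variable name
  when: not __network_is_ostree is defined

As the next entries show, the conditional evaluates to False here because __network_is_ostree was already set by a set_fact earlier in the run, so the stat call never reaches the host and the task is skipped.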
22736 1727204273.15008: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree 22736 1727204273.15115: in run() - task 12b410aa-8751-4f4a-548a-00000000045a 22736 1727204273.15130: variable 'ansible_search_path' from source: unknown 22736 1727204273.15134: variable 'ansible_search_path' from source: unknown 22736 1727204273.15173: calling self._execute() 22736 1727204273.15250: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.15256: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.15273: variable 'omit' from source: magic vars 22736 1727204273.15598: variable 'ansible_distribution_major_version' from source: facts 22736 1727204273.15610: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204273.15817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204273.16074: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204273.16128: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204273.16154: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204273.16269: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204273.16301: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204273.16325: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204273.16353: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204273.16375: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204273.16462: variable '__network_is_ostree' from source: set_fact 22736 1727204273.16472: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204273.16475: when evaluation is False, skipping this task 22736 1727204273.16478: _execute() done 22736 1727204273.16484: dumping result to json 22736 1727204273.16488: done dumping result, returning 22736 1727204273.16499: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if system is ostree [12b410aa-8751-4f4a-548a-00000000045a] 22736 1727204273.16502: sending task result for task 12b410aa-8751-4f4a-548a-00000000045a 22736 1727204273.16602: done sending task result for task 12b410aa-8751-4f4a-548a-00000000045a 22736 1727204273.16605: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204273.16665: no more pending results, returning what we have 22736 1727204273.16669: results queue empty 22736 1727204273.16671: checking for any_errors_fatal 22736 1727204273.16677: done checking for any_errors_fatal 22736 1727204273.16678: checking for 
max_fail_percentage 22736 1727204273.16680: done checking for max_fail_percentage 22736 1727204273.16681: checking to see if all hosts have failed and the running result is not ok 22736 1727204273.16682: done checking to see if all hosts have failed 22736 1727204273.16683: getting the remaining hosts for this loop 22736 1727204273.16685: done getting the remaining hosts for this loop 22736 1727204273.16691: getting the next task for host managed-node2 22736 1727204273.16699: done getting next task for host managed-node2 22736 1727204273.16708: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204273.16712: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204273.16731: getting variables 22736 1727204273.16733: in VariableManager get_vars() 22736 1727204273.16779: Calling all_inventory to load vars for managed-node2 22736 1727204273.16782: Calling groups_inventory to load vars for managed-node2 22736 1727204273.16784: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204273.16803: Calling all_plugins_play to load vars for managed-node2 22736 1727204273.16807: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204273.16810: Calling groups_plugins_play to load vars for managed-node2 22736 1727204273.18245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204273.20002: done with get_vars() 22736 1727204273.20025: done getting variables 22736 1727204273.20074: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.056) 0:00:37.985 ***** 22736 1727204273.20107: entering _queue_task() for managed-node2/set_fact 22736 1727204273.20382: worker is 1 (out of 1 available) 22736 1727204273.20401: exiting _queue_task() for managed-node2/set_fact 22736 1727204273.20415: done queuing things up, now waiting for results queue to drain 22736 1727204273.20417: waiting for pending results... 
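The entries that follow cover the companion task at set_facts.yml:17, queued above as a set_fact action. A plausible sketch, assuming the flag is derived from the stat result of the previous task (the value expression is an assumption; the fact name and the when condition appear verbatim in the log):

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"  # assumed expression referencing the hypothetical register above
  when: not __network_is_ostree is defined

It is skipped for the same reason as the previous task: the flag is already defined, so the guard evaluates to False.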
22736 1727204273.20607: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 22736 1727204273.20712: in run() - task 12b410aa-8751-4f4a-548a-00000000045b 22736 1727204273.20728: variable 'ansible_search_path' from source: unknown 22736 1727204273.20732: variable 'ansible_search_path' from source: unknown 22736 1727204273.20770: calling self._execute() 22736 1727204273.20860: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.20865: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.20883: variable 'omit' from source: magic vars 22736 1727204273.21248: variable 'ansible_distribution_major_version' from source: facts 22736 1727204273.21266: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204273.21461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204273.21694: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204273.21798: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204273.21802: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204273.21840: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204273.21936: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204273.21988: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204273.21994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204273.22031: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204273.22111: variable '__network_is_ostree' from source: set_fact 22736 1727204273.22144: Evaluated conditional (not __network_is_ostree is defined): False 22736 1727204273.22147: when evaluation is False, skipping this task 22736 1727204273.22149: _execute() done 22736 1727204273.22152: dumping result to json 22736 1727204273.22154: done dumping result, returning 22736 1727204273.22157: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12b410aa-8751-4f4a-548a-00000000045b] 22736 1727204273.22160: sending task result for task 12b410aa-8751-4f4a-548a-00000000045b 22736 1727204273.22251: done sending task result for task 12b410aa-8751-4f4a-548a-00000000045b 22736 1727204273.22254: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 22736 1727204273.22322: no more pending results, returning what we have 22736 1727204273.22327: results queue empty 22736 1727204273.22328: checking for any_errors_fatal 22736 1727204273.22338: done checking for any_errors_fatal 22736 
1727204273.22339: checking for max_fail_percentage 22736 1727204273.22340: done checking for max_fail_percentage 22736 1727204273.22342: checking to see if all hosts have failed and the running result is not ok 22736 1727204273.22343: done checking to see if all hosts have failed 22736 1727204273.22344: getting the remaining hosts for this loop 22736 1727204273.22345: done getting the remaining hosts for this loop 22736 1727204273.22350: getting the next task for host managed-node2 22736 1727204273.22360: done getting next task for host managed-node2 22736 1727204273.22366: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204273.22369: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204273.22384: getting variables 22736 1727204273.22386: in VariableManager get_vars() 22736 1727204273.22431: Calling all_inventory to load vars for managed-node2 22736 1727204273.22438: Calling groups_inventory to load vars for managed-node2 22736 1727204273.22441: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204273.22452: Calling all_plugins_play to load vars for managed-node2 22736 1727204273.22455: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204273.22458: Calling groups_plugins_play to load vars for managed-node2 22736 1727204273.24020: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204273.25777: done with get_vars() 22736 1727204273.25819: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Tuesday 24 September 2024 14:57:53 -0400 (0:00:00.057) 0:00:38.044 ***** 22736 1727204273.25911: entering _queue_task() for managed-node2/service_facts 22736 1727204273.26211: worker is 1 (out of 1 available) 22736 1727204273.26230: exiting _queue_task() for managed-node2/service_facts 22736 1727204273.26245: done queuing things up, now waiting for results queue to drain 22736 1727204273.26246: waiting for pending results... 
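The remaining entries in this excerpt execute the task at set_facts.yml:21, queued above as a service_facts action. A minimal sketch of such a task (service_facts takes no required arguments and publishes its results under ansible_facts.services):

- name: Check which services are running
  service_facts:

Unlike the two skipped tasks, this one actually reaches the host: the log below shows Ansible creating a remote temp directory, reusing a cached AnsiballZ build of the service_facts module, and transferring it over the already-established SSH ControlMaster connection ("auto-mux: Trying existing master") rather than opening a new connection for the task.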
22736 1727204273.26441: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running 22736 1727204273.26551: in run() - task 12b410aa-8751-4f4a-548a-00000000045d 22736 1727204273.26563: variable 'ansible_search_path' from source: unknown 22736 1727204273.26566: variable 'ansible_search_path' from source: unknown 22736 1727204273.26606: calling self._execute() 22736 1727204273.26684: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.26690: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.26704: variable 'omit' from source: magic vars 22736 1727204273.27075: variable 'ansible_distribution_major_version' from source: facts 22736 1727204273.27079: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204273.27082: variable 'omit' from source: magic vars 22736 1727204273.27120: variable 'omit' from source: magic vars 22736 1727204273.27154: variable 'omit' from source: magic vars 22736 1727204273.27198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204273.27233: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204273.27254: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204273.27276: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204273.27288: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204273.27317: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204273.27324: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.27329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.27420: Set connection var ansible_timeout to 10 22736 1727204273.27433: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204273.27441: Set connection var ansible_shell_executable to /bin/sh 22736 1727204273.27444: Set connection var ansible_shell_type to sh 22736 1727204273.27451: Set connection var ansible_pipelining to False 22736 1727204273.27454: Set connection var ansible_connection to ssh 22736 1727204273.27482: variable 'ansible_shell_executable' from source: unknown 22736 1727204273.27487: variable 'ansible_connection' from source: unknown 22736 1727204273.27491: variable 'ansible_module_compression' from source: unknown 22736 1727204273.27493: variable 'ansible_shell_type' from source: unknown 22736 1727204273.27497: variable 'ansible_shell_executable' from source: unknown 22736 1727204273.27499: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204273.27502: variable 'ansible_pipelining' from source: unknown 22736 1727204273.27504: variable 'ansible_timeout' from source: unknown 22736 1727204273.27509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204273.27698: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204273.27703: variable 'omit' from source: magic vars 22736 
1727204273.27709: starting attempt loop 22736 1727204273.27712: running the handler 22736 1727204273.27731: _low_level_execute_command(): starting 22736 1727204273.27738: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204273.28374: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204273.28382: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.28433: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204273.28470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204273.30246: stdout chunk (state=3): >>>/root <<< 22736 1727204273.30357: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204273.30448: stderr chunk (state=3): >>><<< 22736 1727204273.30452: stdout chunk (state=3): >>><<< 22736 1727204273.30469: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204273.30484: _low_level_execute_command(): starting 22736 1727204273.30493: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875 `" && echo ansible-tmp-1727204273.304704-24656-177647430361875="` echo /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875 `" ) && sleep 0' 22736 1727204273.31042: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 
3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204273.31046: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204273.31049: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204273.31051: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.31061: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.31122: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204273.31167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204273.31213: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204273.33304: stdout chunk (state=3): >>>ansible-tmp-1727204273.304704-24656-177647430361875=/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875 <<< 22736 1727204273.33438: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204273.33498: stderr chunk (state=3): >>><<< 22736 1727204273.33502: stdout chunk (state=3): >>><<< 22736 1727204273.33522: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204273.304704-24656-177647430361875=/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204273.33581: variable 'ansible_module_compression' from source: unknown 22736 1727204273.33695: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 22736 1727204273.33699: variable 'ansible_facts' from source: unknown 22736 1727204273.33733: transferring module to remote 
/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py 22736 1727204273.33860: Sending initial data 22736 1727204273.33863: Sent initial data (161 bytes) 22736 1727204273.34423: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.34427: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.34430: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.34470: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204273.34478: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204273.34517: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204273.36188: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204273.36225: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204273.36263: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpya95s4fd /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py <<< 22736 1727204273.36267: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py" <<< 22736 1727204273.36300: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpya95s4fd" to remote "/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py" <<< 22736 1727204273.36307: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py" <<< 22736 1727204273.37107: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204273.37185: stderr chunk (state=3): >>><<< 22736 1727204273.37190: stdout chunk (state=3): >>><<< 22736 1727204273.37216: done transferring module to remote 22736 1727204273.37226: _low_level_execute_command(): starting 22736 1727204273.37233: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/ /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py && sleep 0' 22736 1727204273.37768: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.37772: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.37774: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.37779: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.37862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204273.37870: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204273.37912: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204273.39827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204273.39905: stderr chunk (state=3): >>><<< 22736 1727204273.39909: stdout chunk (state=3): >>><<< 22736 1727204273.39919: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204273.39926: _low_level_execute_command(): starting 22736 1727204273.39961: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/AnsiballZ_service_facts.py && sleep 0' 22736 1727204273.40487: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.40499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.40502: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204273.40505: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204273.40579: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204273.40631: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.45337: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", 
"source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": 
"stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": 
"rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name"<<< 22736 1727204275.45422: stdout chunk (state=3): >>>: "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": 
{"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": 
"static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": 
{"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": <<< 22736 1727204275.45432: stdout chunk (state=3): >>>"inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", 
"status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": 
"static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 22736 1727204275.47201: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204275.47205: stdout chunk (state=3): >>><<< 22736 1727204275.47207: stderr chunk (state=3): >>><<< 22736 1727204275.47400: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "mdmonitor.service": {"name": "mdmonitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": 
"stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-quit.service": {"name": "plymouth-quit.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-read-write.service": {"name": "plymouth-read-write.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "static", "source": "systemd"}, "plymouth-switch-root.service": {"name": "plymouth-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "raid-check.service": {"name": "raid-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "snapd.seeded.service": {"name": "snapd.seeded.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": 
{"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-plymouth.service": {"name": "systemd-ask-password-plymouth.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-homed-activate.service": {"name": "systemd-homed-activate.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-homed.service": {"name": "systemd-homed.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-networkd.service": {"name": "systemd-networkd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "running", "status": "enabled", "source": "systemd"}, 
"systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-resolved.service": {"name": "systemd-resolved.service", "state": "running", "status": "enabled", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-userdbd.service": {"name": "systemd-userdbd.service", "state": "running", "status": "indirect", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", 
"state": "stopped", "status": "static", "source": "systemd"}, "systemd-zram-setup@zram0.service": {"name": "systemd-zram-setup@zram0.service", "state": "stopped", "status": "active", "source": "systemd"}, "udisks2.service": {"name": "udisks2.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "unbound-anchor.service": {"name": "unbound-anchor.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "arp-ethers.service": {"name": "arp-ethers.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "bluetooth.service": {"name": "bluetooth.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.bluez.service": {"name": "dbus-org.bluez.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.home1.service": {"name": "dbus-org.freedesktop.home1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.oom1.service": {"name": "dbus-org.freedesktop.oom1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.portable1.service": {"name": "dbus-org.freedesktop.portable1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.resolve1.service": {"name": "dbus-org.freedesktop.resolve1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", 
"state": "inactive", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fwupd-offline-update.service": {"name": "fwupd-offline-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd-refresh.service": {"name": "fwupd-refresh.service", "state": "inactive", "status": "static", "source": "systemd"}, "fwupd.service": {"name": "fwupd.service", "state": "inactive", "status": "static", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "mdadm-grow-continue@.service": {"name": "mdadm-grow-continue@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdadm-last-resort@.service": {"name": "mdadm-last-resort@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdcheck_continue.service": {"name": "mdcheck_continue.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdcheck_start.service": {"name": "mdcheck_start.service", "state": "inactive", "status": "static", "source": "systemd"}, "mdmon@.service": {"name": "mdmon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "mdmonitor-oneshot.service": {"name": "mdmonitor-oneshot.service", "state": "inactive", "status": "static", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-halt.service": {"name": "plymouth-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-kexec.service": {"name": "plymouth-kexec.service", "state": "inactive", "status": "static", "source": 
"systemd"}, "plymouth-poweroff.service": {"name": "plymouth-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-reboot.service": {"name": "plymouth-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "plymouth-switch-root-initramfs.service": {"name": "plymouth-switch-root-initramfs.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon.service": {"name": "quotaon.service", "state": "inactive", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-networkd-wait-online@.service": {"name": "systemd-networkd-wait-online@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-portabled.service": {"name": "systemd-portabled.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck.service": {"name": "systemd-quotacheck.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-time-wait-sync.service": {"name": "systemd-time-wait-sync.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-zram-setup@.service": {"name": "systemd-zram-setup@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204275.48567: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204275.48585: _low_level_execute_command(): starting 22736 1727204275.48607: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204273.304704-24656-177647430361875/ > /dev/null 2>&1 && sleep 0' 22736 1727204275.49404: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.49455: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.49525: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.51547: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204275.51617: stderr chunk (state=3): >>><<< 22736 1727204275.51635: stdout chunk (state=3): >>><<< 22736 
1727204275.51653: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204275.51667: handler run complete 22736 1727204275.51982: variable 'ansible_facts' from source: unknown 22736 1727204275.52202: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204275.53015: variable 'ansible_facts' from source: unknown 22736 1727204275.53239: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204275.53621: attempt loop complete, returning result 22736 1727204275.53632: _execute() done 22736 1727204275.53635: dumping result to json 22736 1727204275.53728: done dumping result, returning 22736 1727204275.53736: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which services are running [12b410aa-8751-4f4a-548a-00000000045d] 22736 1727204275.53739: sending task result for task 12b410aa-8751-4f4a-548a-00000000045d ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204275.54864: done sending task result for task 12b410aa-8751-4f4a-548a-00000000045d 22736 1727204275.54867: WORKER PROCESS EXITING 22736 1727204275.54878: no more pending results, returning what we have 22736 1727204275.54882: results queue empty 22736 1727204275.54883: checking for any_errors_fatal 22736 1727204275.54890: done checking for any_errors_fatal 22736 1727204275.54891: checking for max_fail_percentage 22736 1727204275.54893: done checking for max_fail_percentage 22736 1727204275.54894: checking to see if all hosts have failed and the running result is not ok 22736 1727204275.54895: done checking to see if all hosts have failed 22736 1727204275.54896: getting the remaining hosts for this loop 22736 1727204275.54898: done getting the remaining hosts for this loop 22736 1727204275.54902: getting the next task for host managed-node2 22736 1727204275.54908: done getting next task for host managed-node2 22736 1727204275.54912: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 22736 1727204275.54915: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204275.54927: getting variables 22736 1727204275.54929: in VariableManager get_vars() 22736 1727204275.54970: Calling all_inventory to load vars for managed-node2 22736 1727204275.54973: Calling groups_inventory to load vars for managed-node2 22736 1727204275.54976: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204275.54987: Calling all_plugins_play to load vars for managed-node2 22736 1727204275.54993: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204275.54997: Calling groups_plugins_play to load vars for managed-node2 22736 1727204275.57245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204275.59509: done with get_vars() 22736 1727204275.59537: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Tuesday 24 September 2024 14:57:55 -0400 (0:00:02.337) 0:00:40.381 ***** 22736 1727204275.59625: entering _queue_task() for managed-node2/package_facts 22736 1727204275.59894: worker is 1 (out of 1 available) 22736 1727204275.59909: exiting _queue_task() for managed-node2/package_facts 22736 1727204275.59926: done queuing things up, now waiting for results queue to drain 22736 1727204275.59928: waiting for pending results... 
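The two tasks traced around this point both come from the role's set_facts.yml: the service_facts call that produced the large services dictionary above (its result was censored because no_log was set), followed by the package_facts call being queued here. As a minimal sketch only, assuming the ansible.builtin modules with their default arguments and not reflecting the actual contents of roles/network/tasks/set_facts.yml, such fact-gathering tasks typically look like this:

# Hypothetical illustration of fact-gathering tasks of the kind this log executes.
- name: Check which services are running
  ansible.builtin.service_facts:
  no_log: true          # matches the censored service_facts result shown above

- name: Check which packages are installed
  ansible.builtin.package_facts:

# Example consumer of the gathered facts (hypothetical task, shown only to
# illustrate how later tasks can test the dictionaries these modules populate).
- name: React to a gathered service fact
  ansible.builtin.debug:
    msg: "firewalld unit file is present on this host"
  when: "'firewalld.service' in ansible_facts.services"

Both modules take no required options; service_facts fills ansible_facts.services with the per-unit name/state/status/source entries seen in the JSON above, and package_facts fills ansible_facts.packages with per-package name/version/release/arch entries like those in the output that follows.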
22736 1727204275.60113: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed 22736 1727204275.60273: in run() - task 12b410aa-8751-4f4a-548a-00000000045e 22736 1727204275.60277: variable 'ansible_search_path' from source: unknown 22736 1727204275.60280: variable 'ansible_search_path' from source: unknown 22736 1727204275.60286: calling self._execute() 22736 1727204275.60368: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204275.60373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204275.60384: variable 'omit' from source: magic vars 22736 1727204275.60696: variable 'ansible_distribution_major_version' from source: facts 22736 1727204275.60709: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204275.60714: variable 'omit' from source: magic vars 22736 1727204275.60766: variable 'omit' from source: magic vars 22736 1727204275.60796: variable 'omit' from source: magic vars 22736 1727204275.60834: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204275.60865: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204275.60885: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204275.60904: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204275.60916: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204275.60948: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204275.60951: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204275.60956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204275.61041: Set connection var ansible_timeout to 10 22736 1727204275.61055: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204275.61063: Set connection var ansible_shell_executable to /bin/sh 22736 1727204275.61066: Set connection var ansible_shell_type to sh 22736 1727204275.61072: Set connection var ansible_pipelining to False 22736 1727204275.61075: Set connection var ansible_connection to ssh 22736 1727204275.61098: variable 'ansible_shell_executable' from source: unknown 22736 1727204275.61101: variable 'ansible_connection' from source: unknown 22736 1727204275.61104: variable 'ansible_module_compression' from source: unknown 22736 1727204275.61108: variable 'ansible_shell_type' from source: unknown 22736 1727204275.61111: variable 'ansible_shell_executable' from source: unknown 22736 1727204275.61116: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204275.61121: variable 'ansible_pipelining' from source: unknown 22736 1727204275.61124: variable 'ansible_timeout' from source: unknown 22736 1727204275.61130: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204275.61302: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204275.61311: variable 'omit' from source: magic vars 22736 
1727204275.61316: starting attempt loop 22736 1727204275.61322: running the handler 22736 1727204275.61335: _low_level_execute_command(): starting 22736 1727204275.61342: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204275.62133: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.62164: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.63998: stdout chunk (state=3): >>>/root <<< 22736 1727204275.64116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204275.64181: stderr chunk (state=3): >>><<< 22736 1727204275.64194: stdout chunk (state=3): >>><<< 22736 1727204275.64286: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204275.64292: _low_level_execute_command(): starting 22736 1727204275.64303: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237 `" && echo ansible-tmp-1727204275.6422606-24859-138497821374237="` echo /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237 `" ) && sleep 0' 22736 1727204275.64964: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204275.64990: stderr chunk (state=3): >>>debug1: Reading configuration 
data /root/.ssh/config <<< 22736 1727204275.64994: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204275.65010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204275.65020: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204275.65051: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.65055: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204275.65064: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.65143: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.65187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.67323: stdout chunk (state=3): >>>ansible-tmp-1727204275.6422606-24859-138497821374237=/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237 <<< 22736 1727204275.67449: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204275.67497: stderr chunk (state=3): >>><<< 22736 1727204275.67501: stdout chunk (state=3): >>><<< 22736 1727204275.67518: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204275.6422606-24859-138497821374237=/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204275.67567: variable 'ansible_module_compression' from source: unknown 22736 1727204275.67608: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 22736 1727204275.67671: variable 'ansible_facts' from source: unknown 22736 1727204275.67779: transferring module to 
remote /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py 22736 1727204275.67908: Sending initial data 22736 1727204275.67912: Sent initial data (162 bytes) 22736 1727204275.68384: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204275.68388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.68392: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204275.68395: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.68447: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204275.68451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.68497: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.70179: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204275.70239: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204275.70290: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpyq2luxo2 /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py <<< 22736 1727204275.70296: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py" <<< 22736 1727204275.70332: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpyq2luxo2" to remote "/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py" <<< 22736 1727204275.72710: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204275.73016: stderr chunk (state=3): >>><<< 22736 1727204275.73019: stdout chunk (state=3): >>><<< 22736 1727204275.73022: done transferring module to remote 22736 1727204275.73024: _low_level_execute_command(): starting 22736 1727204275.73027: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/ /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py && sleep 0' 22736 1727204275.73811: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.73849: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204275.73870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204275.73904: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.74013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204275.76065: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204275.76094: stderr chunk (state=3): >>><<< 22736 1727204275.76109: stdout chunk (state=3): >>><<< 22736 1727204275.76184: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204275.76188: _low_level_execute_command(): starting 22736 1727204275.76193: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/AnsiballZ_package_facts.py && sleep 0' 22736 1727204275.76603: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204275.76618: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204275.76632: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204275.76720: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204275.76772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204276.41523: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "r<<< 22736 1727204276.41598: stdout chunk (state=3): >>>pm"}], "readline": [{"name": 
"readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-<<< 22736 1727204276.41670: stdout chunk (state=3): >>>libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": 
"3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", 
"release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": 
[{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb",<<< 22736 1727204276.41681: stdout chunk (state=3): >>> "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": 
"gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.<<< 22736 1727204276.41750: stdout chunk (state=3): >>>fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": "fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5<<< 22736 1727204276.41757: stdout chunk (state=3): >>>", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": 
"9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", 
"version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": 
"2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch<<< 22736 1727204276.41815: stdout chunk (state=3): >>>": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": "libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": 
"0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": nul<<< 22736 1727204276.41822: stdout chunk (state=3): >>>l, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": "net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": 
"lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 22736 1727204276.43802: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204276.43806: stdout chunk (state=3): >>><<< 22736 1727204276.43808: stderr chunk (state=3): >>><<< 22736 1727204276.43888: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "12.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.40", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-identity-basic": [{"name": "fedora-release-identity-basic", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-gpg-keys": [{"name": "fedora-gpg-keys", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-repos": [{"name": "fedora-repos", "version": "39", "release": "2", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release-common": [{"name": "fedora-release-common", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "fedora-release": [{"name": "fedora-release", "version": "39", "release": "36", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240101", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "abattis-cantarell-vf-fonts": [{"name": "abattis-cantarell-vf-fonts", "version": "0.301", "release": "10.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240909", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pkgconf-m4": [{"name": "pkgconf-m4", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "libreport-filesystem": [{"name": "libreport-filesystem", "version": "2.17.11", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-common": [{"name": "dhcp-common", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "noarch", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib": [{"name": "zlib", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.48", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.10.4", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nspr": [{"name": "nspr", "version": "4.35.0", "release": "22.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "5.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.47", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.42", "release": "1.fc39.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-util": [{"name": "nss-util", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.42.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.26", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "53.20240808cvs.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.1", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.3", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.8", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.9.0", "release": "6.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.15.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.9.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", 
"version": "0.5.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.14.0", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.55.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "protobuf-c": [{"name": "protobuf-c", "version": "1.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.44", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.3", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "7.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.6.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.1.0", "release": "4.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.14", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "3.fc39", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.10.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdb": [{"name": "libdb", "version": "5.3.28", "release": "56.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfs-3g-libs": [{"name": "ntfs-3g-libs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ntfs-3g": [{"name": "ntfs-3g", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libibverbs": [{"name": "libibverbs", "version": "46.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpcap": [{"name": "libpcap", "version": "1.10.4", "release": "2.fc39", "epoch": 14, "arch": "x86_64", "source": "rpm"}], "libb2": [{"name": "libb2", "version": "0.98.1", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbytesize": [{"name": "libbytesize", "version": "2.11", "release": "99.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano": [{"name": "nano", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nano-default-editor": [{"name": "nano-default-editor", "version": "7.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libatasmart": [{"name": "libatasmart", "version": "0.19", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-softokn-freebl": [{"name": "nss-softokn-freebl", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "nss-softokn": [{"name": "nss-softokn", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "less": [{"name": "less", "version": "633", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.28.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "exfatprogs": [{"name": "exfatprogs", "version": "1.2.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmaxminddb": [{"name": "libmaxminddb", "version": "1.11.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipcalc": [{"name": "ipcalc", "version": "1.0.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.33", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.6", "release": "17.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dosfstools": [{"name": "dosfstools", "version": "4.2", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.13.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libargon2": [{"name": "libargon2", "version": "20190702", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevdev": [{"name": "libevdev", "version": "1.13.1", 
"release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-core-libs": [{"name": "plymouth-core-libs", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.8", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "54.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpkgconf": [{"name": "libpkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf": [{"name": "pkgconf", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pkgconf-pkg-config": [{"name": "pkgconf-pkg-config", "version": "1.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-efi": [{"name": "fwupd-efi", "version": "1.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsigsegv": [{"name": "libsigsegv", "version": "2.14", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.2.2", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtool-ltdl": [{"name": "libtool-ltdl", "version": "2.4.7", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "btrfs-progs": [{"name": "btrfs-progs", "version": "6.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.4.0", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.4.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": 
"firewalld-filesystem", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-pip-wheel": [{"name": "python-pip-wheel", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnsl2": [{"name": "libnsl2", "version": "2.0.0", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20231204", "release": "1.git1e3a2e4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.12", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", 
"version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.3", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2023.2.60_v7.0.306", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.0.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.5.3", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.6.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "30", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.13.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-networkd": [{"name": "systemd-networkd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-resolved": [{"name": "systemd-resolved", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "73.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.39.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "254.16", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "254.16", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.78.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-utils": [{"name": "libblockdev-utils", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "123", "release": "1.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "26.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-glib": [{"name": "json-glib", "version": "1.8.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgudev": [{"name": "libgudev", "version": "238", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgusb": [{"name": "libgusb", "version": "0.4.9", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmbim": [{"name": "libmbim", "version": "1.28.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shared-mime-info": [{"name": "shared-mime-info", "version": "2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxmlb": [{"name": "libxmlb", "version": "0.3.19", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.1", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zchunk-libs": [{"name": "zchunk-libs", "version": "1.5.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.8.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.5.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.0.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gpgme": [{"name": "gpgme", "version": "1.20.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libjcat": [{"name": "libjcat", "version": "0.2.1", "release": "1.fc39", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev": [{"name": "libblockdev", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-fs": [{"name": "libblockdev-fs", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-loop": [{"name": "libblockdev-loop", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-part": [{"name": "libblockdev-part", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-swap": [{"name": "libblockdev-swap", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bluez": [{"name": "bluez", "version": "5.77", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ModemManager-glib": [{"name": "ModemManager-glib", "version": "1.20.6", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.78.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libqrtr-glib": [{"name": "libqrtr-glib", "version": "1.2.2", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libqmi": [{"name": "libqmi", "version": "1.32.4", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libudisks2": [{"name": "libudisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuser": [{"name": "libuser", "version": "0.64", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ntfsprogs": [{"name": "ntfsprogs", "version": "2022.10.3", "release": "3.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.20", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20221126", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mdadm": [{"name": "mdadm", "version": "4.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-mdraid": [{"name": "libblockdev-mdraid", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator": [{"name": "zram-generator", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth": [{"name": "plymouth", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "plymouth-scripts": [{"name": "plymouth-scripts", "version": "24.004.60", "release": "12.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "1.p5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": 
"ima-evm-utils", "version": "1.5", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.7.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.30", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "deltarpm": [{"name": "deltarpm", "version": "3.6.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-anchor": [{"name": "unbound-anchor", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unbound-libs": [{"name": "unbound-libs", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-unbound": [{"name": "python3-unbound", "version": "1.20.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfsverity": [{"name": "libfsverity", "version": "1.4", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnvme": [{"name": "libnvme", "version": "1.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-nvme": [{"name": "libblockdev-nvme", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss-sysinit": [{"name": "nss-sysinit", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nss": [{"name": "nss", "version": "3.103.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "volume_key-libs": [{"name": "volume_key-libs", "version": "0.3.12", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libblockdev-crypto": [{"name": "libblockdev-crypto", "version": "3.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "udisks2": [{"name": "udisks2", "version": "2.10.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-modem-manager": [{"name": 
"fwupd-plugin-modem-manager", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd-plugin-uefi-capsule-data": [{"name": "fwupd-plugin-uefi-capsule-data", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fwupd": [{"name": "fwupd", "version": "1.9.21", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.9", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "40.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.2.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "39.7", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.20", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.8.0", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.7", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "12.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "10.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.0.5", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.21.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", 
"version": "2.06", "release": "121.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zram-generator-defaults": [{"name": "zram-generator-defaults", "version": "1.1.2", "release": "11.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "dhcp-client": [{"name": "dhcp-client", "version": "4.4.3", "release": "9.P1.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "passwd": [{"name": "passwd", "version": "0.80", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.44.2", "release": "1.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.3p1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "059", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.11.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.5", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.0", "release": "9.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "7.20230520.fc39.1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "34.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.58", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.88", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.188", "release": "501.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "502.fc39", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.21", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2023.0511", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "3.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.92", "release": "10.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20230801", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.10", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.083", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.13", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "4.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.037", "release": "3.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": 
"1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.15", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.28", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "500.fc39", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.52", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "500.fc39", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "501.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.13", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "500.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.54", "release": "500.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.77", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.89", "release": "500.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.16", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.19", "release": "500.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.54", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "500.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.38.2", "release": "502.fc39", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "boost-system": [{"name": "boost-system", "version": "1.81.0", 
"release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.40", "release": "14.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd-devel": [{"name": "libzstd-devel", "version": "1.5.6", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-devel": [{"name": "zlib-devel", "version": "1.2.13", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf-devel": [{"name": "elfutils-libelf-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "unzip": [{"name": "unzip", "version": "6.0", "release": "62.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "m4": [{"name": "m4", "version": "1.4.19", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmpc": [{"name": "libmpc", "version": "1.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "avahi-libs": [{"name": "avahi-libs", "version": "0.8", "release": "24.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "aspell": [{"name": "aspell", "version": "0.60.8", "release": "12.fc39", "epoch": 12, "arch": "x86_64", "source": "rpm"}], "cpp": [{"name": "cpp", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bison": [{"name": "bison", "version": "3.8.2", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "flex": [{"name": "flex", "version": "2.6.4", "release": "13.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zip": [{"name": "zip", "version": "3.0", "release": "39.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-thread": [{"name": "boost-thread", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "strace": [{"name": "strace", "version": "6.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-devel": [{"name": "openssl-devel", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-headers": [{"name": "kernel-headers", "version": "6.10.3", "release": "200.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-headers-x86": [{"name": "glibc-headers-x86", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-devel": [{"name": "glibc-devel", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt-devel": [{"name": "libxcrypt-devel", "version": "4.4.36", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client-devel": [{"name": "elfutils-debuginfod-client-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "efivar-libs": [{"name": "efivar-libs", "version": "39", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mokutil": [{"name": "mokutil", "version": "0.6.0", "release": "7.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "boost-atomic": [{"name": "boost-atomic", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "boost-filesystem": [{"name": "boost-filesystem", "version": "1.81.0", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-devel": [{"name": "xz-devel", 
"version": "5.4.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-devel": [{"name": "elfutils-devel", "version": "0.191", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tbb": [{"name": "tbb", "version": "2020.3", "release": "20.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dyninst": [{"name": "dyninst", "version": "12.3.0", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-runtime": [{"name": "systemtap-runtime", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-client": [{"name": "systemtap-client", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gc": [{"name": "gc", "version": "8.2.2", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "guile22": [{"name": "guile22", "version": "2.2.7", "release": "9.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "2.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gcc": [{"name": "gcc", "version": "13.3.1", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-devel": [{"name": "kernel-devel", "version": "6.10.10", "release": "100.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap-devel": [{"name": "systemtap-devel", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemtap": [{"name": "systemtap", "version": "5.2~pre17250223gd07e4284", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "qa-tools": [{"name": "qa-tools", "version": "4.1", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "aspell-en": [{"name": "aspell-en", "version": "2020.12.07", "release": "8.fc39", "epoch": 50, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "23.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "1.rc1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "502.fc39", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.43", "release": "502.fc39", "epoch": 0, "arch": "noarch", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.42", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "4.9.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsodium": [{"name": 
"libsodium", "version": "1.0.18", "release": "15.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-utils": [{"name": "dnf-utils", "version": "4.9.0", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "xxhash-libs": [{"name": "xxhash-libs", "version": "0.8.2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "2.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "3.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "18.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "13.fc39", "epoch": 1, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.46.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "libmetalink": [{"name": "libmetalink", "version": "0.1.3", "release": "32.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.21.4", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "8.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.1", "release": "6.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "0.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gpm-libs": [{"name": "gpm-libs", "version": "1.20.7", "release": "44.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "19.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.31.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.fc39eng", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.fc39eng", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.719", "release": "1.fc39", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.9.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "1.fc39", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.1.0", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.30.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.7.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "67.7.2", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.4.3", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "3.1.5", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.1.4", "release": "4.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "gdisk": [{"name": "gdisk", "version": "1.0.10", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "11.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pysocks": [{"name": "python3-pysocks", "version": "1.7.1", "release": "20.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3+socks": [{"name": "python3-urllib3+socks", "version": "1.26.19", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.28.2", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "6.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-netifaces": [{"name": "python3-netifaces", "version": "0.11.0", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "1.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.5", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "5.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "net-tools": [{"name": 
"net-tools", "version": "2.0", "release": "0.67.20160912git.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.5", "release": "3.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.5", "release": "8.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "16.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.197", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.12", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.23", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "3.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "gpg-pubkey": [{"name": "gpg-pubkey", "version": "18b8e74c", "release": "62f2920f", "epoch": null, "arch": null, "source": "rpm"}], "rtl-sdr": [{"name": "rtl-sdr", "version": "0.6.0^20230921git1261fbb2", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssl-pkcs11": [{"name": "openssl-pkcs11", "version": "0.4.12", "release": "4.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.4.1", "release": "5.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.16", "release": "2.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip": [{"name": "python3-pip", "version": "23.2.1", "release": "2.fc39", "epoch": null, "arch": "noarch", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "7.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "9.fc39", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "1.fc39", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204276.48039: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204276.48060: _low_level_execute_command(): starting 22736 1727204276.48070: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204275.6422606-24859-138497821374237/ > /dev/null 2>&1 && sleep 0' 22736 1727204276.48738: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204276.48808: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204276.48877: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204276.48907: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204276.48973: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204276.51006: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204276.51099: stderr chunk (state=3): >>><<< 22736 1727204276.51113: stdout chunk (state=3): >>><<< 22736 1727204276.51135: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204276.51148: handler run complete 22736 1727204276.52779: variable 'ansible_facts' from source: unknown 22736 1727204276.53694: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.57581: variable 'ansible_facts' from source: unknown 22736 1727204276.58510: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.60054: attempt loop complete, returning result 22736 1727204276.60057: _execute() done 22736 1727204276.60060: dumping result to json 22736 1727204276.60415: done dumping result, returning 22736 1727204276.60435: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check which packages are installed [12b410aa-8751-4f4a-548a-00000000045e] 22736 1727204276.60444: sending task result for task 12b410aa-8751-4f4a-548a-00000000045e 22736 1727204276.64246: done sending task result for task 12b410aa-8751-4f4a-548a-00000000045e 22736 1727204276.64250: WORKER PROCESS EXITING ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204276.64423: no more pending results, returning what we have 22736 1727204276.64427: results queue empty 22736 1727204276.64428: checking for any_errors_fatal 22736 1727204276.64435: done checking for any_errors_fatal 22736 1727204276.64436: checking for max_fail_percentage 22736 1727204276.64438: done checking for max_fail_percentage 22736 1727204276.64439: checking to see if all hosts have failed and the running result is not ok 22736 1727204276.64440: done checking to see if all hosts have failed 22736 1727204276.64441: getting the remaining hosts for this loop 22736 1727204276.64442: done getting the remaining hosts for this loop 22736 1727204276.64447: getting the next task for host managed-node2 22736 1727204276.64454: done getting next task for host managed-node2 22736 1727204276.64458: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204276.64461: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204276.64472: getting variables 22736 1727204276.64473: in VariableManager get_vars() 22736 1727204276.64513: Calling all_inventory to load vars for managed-node2 22736 1727204276.64519: Calling groups_inventory to load vars for managed-node2 22736 1727204276.64522: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204276.64533: Calling all_plugins_play to load vars for managed-node2 22736 1727204276.64536: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204276.64540: Calling groups_plugins_play to load vars for managed-node2 22736 1727204276.66777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.69913: done with get_vars() 22736 1727204276.69959: done getting variables 22736 1727204276.70035: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Tuesday 24 September 2024 14:57:56 -0400 (0:00:01.104) 0:00:41.485 ***** 22736 1727204276.70078: entering _queue_task() for managed-node2/debug 22736 1727204276.70516: worker is 1 (out of 1 available) 22736 1727204276.70531: exiting _queue_task() for managed-node2/debug 22736 1727204276.70545: done queuing things up, now waiting for results queue to drain 22736 1727204276.70546: waiting for pending results... 22736 1727204276.70824: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider 22736 1727204276.70956: in run() - task 12b410aa-8751-4f4a-548a-00000000005d 22736 1727204276.70979: variable 'ansible_search_path' from source: unknown 22736 1727204276.70986: variable 'ansible_search_path' from source: unknown 22736 1727204276.71044: calling self._execute() 22736 1727204276.71161: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.71175: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.71194: variable 'omit' from source: magic vars 22736 1727204276.71683: variable 'ansible_distribution_major_version' from source: facts 22736 1727204276.71688: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204276.71700: variable 'omit' from source: magic vars 22736 1727204276.71756: variable 'omit' from source: magic vars 22736 1727204276.72010: variable 'network_provider' from source: set_fact 22736 1727204276.72014: variable 'omit' from source: magic vars 22736 1727204276.72017: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204276.72030: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204276.72060: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204276.72088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204276.72114: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 
1727204276.72235: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204276.72239: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.72241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.72320: Set connection var ansible_timeout to 10 22736 1727204276.72348: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204276.72364: Set connection var ansible_shell_executable to /bin/sh 22736 1727204276.72372: Set connection var ansible_shell_type to sh 22736 1727204276.72383: Set connection var ansible_pipelining to False 22736 1727204276.72393: Set connection var ansible_connection to ssh 22736 1727204276.72427: variable 'ansible_shell_executable' from source: unknown 22736 1727204276.72436: variable 'ansible_connection' from source: unknown 22736 1727204276.72449: variable 'ansible_module_compression' from source: unknown 22736 1727204276.72460: variable 'ansible_shell_type' from source: unknown 22736 1727204276.72468: variable 'ansible_shell_executable' from source: unknown 22736 1727204276.72475: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.72562: variable 'ansible_pipelining' from source: unknown 22736 1727204276.72566: variable 'ansible_timeout' from source: unknown 22736 1727204276.72568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.72692: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204276.72712: variable 'omit' from source: magic vars 22736 1727204276.72725: starting attempt loop 22736 1727204276.72733: running the handler 22736 1727204276.72798: handler run complete 22736 1727204276.72824: attempt loop complete, returning result 22736 1727204276.72832: _execute() done 22736 1727204276.72841: dumping result to json 22736 1727204276.72849: done dumping result, returning 22736 1727204276.72860: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Print network provider [12b410aa-8751-4f4a-548a-00000000005d] 22736 1727204276.72869: sending task result for task 12b410aa-8751-4f4a-548a-00000000005d ok: [managed-node2] => {} MSG: Using network provider: nm 22736 1727204276.73065: no more pending results, returning what we have 22736 1727204276.73070: results queue empty 22736 1727204276.73072: checking for any_errors_fatal 22736 1727204276.73085: done checking for any_errors_fatal 22736 1727204276.73087: checking for max_fail_percentage 22736 1727204276.73088: done checking for max_fail_percentage 22736 1727204276.73093: checking to see if all hosts have failed and the running result is not ok 22736 1727204276.73094: done checking to see if all hosts have failed 22736 1727204276.73095: getting the remaining hosts for this loop 22736 1727204276.73096: done getting the remaining hosts for this loop 22736 1727204276.73102: getting the next task for host managed-node2 22736 1727204276.73110: done getting next task for host managed-node2 22736 1727204276.73115: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22736 1727204276.73120: ^ state is: HOST STATE: block=2, 
task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204276.73132: getting variables 22736 1727204276.73135: in VariableManager get_vars() 22736 1727204276.73178: Calling all_inventory to load vars for managed-node2 22736 1727204276.73182: Calling groups_inventory to load vars for managed-node2 22736 1727204276.73185: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204276.73414: Calling all_plugins_play to load vars for managed-node2 22736 1727204276.73421: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204276.73428: done sending task result for task 12b410aa-8751-4f4a-548a-00000000005d 22736 1727204276.73432: WORKER PROCESS EXITING 22736 1727204276.73436: Calling groups_plugins_play to load vars for managed-node2 22736 1727204276.75782: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.79049: done with get_vars() 22736 1727204276.79087: done getting variables 22736 1727204276.79168: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.091) 0:00:41.576 ***** 22736 1727204276.79207: entering _queue_task() for managed-node2/fail 22736 1727204276.79802: worker is 1 (out of 1 available) 22736 1727204276.79814: exiting _queue_task() for managed-node2/fail 22736 1727204276.79828: done queuing things up, now waiting for results queue to drain 22736 1727204276.79829: waiting for pending results... 
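The "Print network provider" result recorded just above ("Using network provider: nm") comes from an ordinary debug task inside the role. The role's tasks/main.yml is not reproduced in this trace, so the following is only a minimal sketch of what such a task could look like, reusing the network_provider variable name that the trace reports being read from set_fact:

    # Hypothetical reconstruction -- the real task lives in the role's
    # tasks/main.yml (fedora.linux_system_roles.network) and may differ.
    - name: Print network provider
      ansible.builtin.debug:
        msg: "Using network provider: {{ network_provider }}"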
22736 1727204276.79947: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 22736 1727204276.80180: in run() - task 12b410aa-8751-4f4a-548a-00000000005e 22736 1727204276.80185: variable 'ansible_search_path' from source: unknown 22736 1727204276.80187: variable 'ansible_search_path' from source: unknown 22736 1727204276.80192: calling self._execute() 22736 1727204276.80271: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.80294: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.80311: variable 'omit' from source: magic vars 22736 1727204276.80779: variable 'ansible_distribution_major_version' from source: facts 22736 1727204276.80800: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204276.80983: variable 'network_state' from source: role '' defaults 22736 1727204276.81004: Evaluated conditional (network_state != {}): False 22736 1727204276.81013: when evaluation is False, skipping this task 22736 1727204276.81024: _execute() done 22736 1727204276.81033: dumping result to json 22736 1727204276.81041: done dumping result, returning 22736 1727204276.81060: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12b410aa-8751-4f4a-548a-00000000005e] 22736 1727204276.81071: sending task result for task 12b410aa-8751-4f4a-548a-00000000005e 22736 1727204276.81423: done sending task result for task 12b410aa-8751-4f4a-548a-00000000005e 22736 1727204276.81427: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204276.81474: no more pending results, returning what we have 22736 1727204276.81478: results queue empty 22736 1727204276.81479: checking for any_errors_fatal 22736 1727204276.81485: done checking for any_errors_fatal 22736 1727204276.81486: checking for max_fail_percentage 22736 1727204276.81488: done checking for max_fail_percentage 22736 1727204276.81491: checking to see if all hosts have failed and the running result is not ok 22736 1727204276.81493: done checking to see if all hosts have failed 22736 1727204276.81494: getting the remaining hosts for this loop 22736 1727204276.81495: done getting the remaining hosts for this loop 22736 1727204276.81499: getting the next task for host managed-node2 22736 1727204276.81505: done getting next task for host managed-node2 22736 1727204276.81509: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204276.81512: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204276.81530: getting variables 22736 1727204276.81531: in VariableManager get_vars() 22736 1727204276.81571: Calling all_inventory to load vars for managed-node2 22736 1727204276.81575: Calling groups_inventory to load vars for managed-node2 22736 1727204276.81578: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204276.81594: Calling all_plugins_play to load vars for managed-node2 22736 1727204276.81598: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204276.81603: Calling groups_plugins_play to load vars for managed-node2 22736 1727204276.83883: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.86520: done with get_vars() 22736 1727204276.86551: done getting variables 22736 1727204276.86605: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.074) 0:00:41.651 ***** 22736 1727204276.86636: entering _queue_task() for managed-node2/fail 22736 1727204276.86911: worker is 1 (out of 1 available) 22736 1727204276.86926: exiting _queue_task() for managed-node2/fail 22736 1727204276.86939: done queuing things up, now waiting for results queue to drain 22736 1727204276.86941: waiting for pending results... 
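Both "Abort applying the network state configuration …" guards around this point (the initscripts one skipped above and the "below 8" one queued next) are reported as skipped with false_condition "network_state != {}". In playbook terms that pattern is a fail task behind a when clause; a rough sketch under that assumption follows, with an invented message and no claim that it matches the role's literal source:

    # Illustrative only: the exact fail message and any additional conditions
    # in the real role are not visible in this trace.
    - name: Abort applying the network state configuration if using the `network_state` variable with the initscripts provider
      ansible.builtin.fail:
        msg: "Applying network_state is not supported with the initscripts provider"
      when: network_state != {}

Because network_state comes from the role defaults here and is an empty dict, the condition evaluates to False and the task is skipped instead of failing the play.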
22736 1727204276.87141: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 22736 1727204276.87231: in run() - task 12b410aa-8751-4f4a-548a-00000000005f 22736 1727204276.87244: variable 'ansible_search_path' from source: unknown 22736 1727204276.87248: variable 'ansible_search_path' from source: unknown 22736 1727204276.87285: calling self._execute() 22736 1727204276.87366: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.87373: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.87390: variable 'omit' from source: magic vars 22736 1727204276.87708: variable 'ansible_distribution_major_version' from source: facts 22736 1727204276.87722: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204276.87830: variable 'network_state' from source: role '' defaults 22736 1727204276.87842: Evaluated conditional (network_state != {}): False 22736 1727204276.87846: when evaluation is False, skipping this task 22736 1727204276.87849: _execute() done 22736 1727204276.87853: dumping result to json 22736 1727204276.87867: done dumping result, returning 22736 1727204276.87891: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12b410aa-8751-4f4a-548a-00000000005f] 22736 1727204276.87894: sending task result for task 12b410aa-8751-4f4a-548a-00000000005f 22736 1727204276.87988: done sending task result for task 12b410aa-8751-4f4a-548a-00000000005f skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204276.88147: no more pending results, returning what we have 22736 1727204276.88151: results queue empty 22736 1727204276.88152: checking for any_errors_fatal 22736 1727204276.88157: done checking for any_errors_fatal 22736 1727204276.88158: checking for max_fail_percentage 22736 1727204276.88160: done checking for max_fail_percentage 22736 1727204276.88160: checking to see if all hosts have failed and the running result is not ok 22736 1727204276.88161: done checking to see if all hosts have failed 22736 1727204276.88162: getting the remaining hosts for this loop 22736 1727204276.88164: done getting the remaining hosts for this loop 22736 1727204276.88167: getting the next task for host managed-node2 22736 1727204276.88173: done getting next task for host managed-node2 22736 1727204276.88177: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204276.88179: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204276.88195: getting variables 22736 1727204276.88196: in VariableManager get_vars() 22736 1727204276.88234: Calling all_inventory to load vars for managed-node2 22736 1727204276.88238: Calling groups_inventory to load vars for managed-node2 22736 1727204276.88240: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204276.88251: Calling all_plugins_play to load vars for managed-node2 22736 1727204276.88254: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204276.88259: Calling groups_plugins_play to load vars for managed-node2 22736 1727204276.88806: WORKER PROCESS EXITING 22736 1727204276.93920: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204276.96450: done with get_vars() 22736 1727204276.96475: done getting variables 22736 1727204276.96533: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Tuesday 24 September 2024 14:57:56 -0400 (0:00:00.099) 0:00:41.750 ***** 22736 1727204276.96562: entering _queue_task() for managed-node2/fail 22736 1727204276.96923: worker is 1 (out of 1 available) 22736 1727204276.96938: exiting _queue_task() for managed-node2/fail 22736 1727204276.96952: done queuing things up, now waiting for results queue to drain 22736 1727204276.96954: waiting for pending results... 
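The teaming guard queued above is evaluated in the trace that follows: first ansible_distribution_major_version | int > 9 (True on this Fedora 39 node), then ansible_distribution in __network_rh_distros (False, so the task is skipped). A sketch of a guard combining those two conditions, again assuming a fail task with an invented message:

    # Hypothetical shape of the EL10+ teaming guard; the condition expressions
    # are taken from the trace, the message text is not.
    - name: Abort applying teaming configuration if the system version of the managed host is EL10 or later
      ansible.builtin.fail:
        msg: "Team interfaces are not supported on EL10 or later"
      when:
        - ansible_distribution_major_version | int > 9
        - ansible_distribution in __network_rh_distros

With a when list, every entry must be true; since the distribution check comes out False, that is the false_condition reported in the skip result below.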
22736 1727204276.97210: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 22736 1727204276.97357: in run() - task 12b410aa-8751-4f4a-548a-000000000060 22736 1727204276.97380: variable 'ansible_search_path' from source: unknown 22736 1727204276.97387: variable 'ansible_search_path' from source: unknown 22736 1727204276.97450: calling self._execute() 22736 1727204276.97595: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204276.97600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204276.97603: variable 'omit' from source: magic vars 22736 1727204276.98070: variable 'ansible_distribution_major_version' from source: facts 22736 1727204276.98097: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204276.98245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.00200: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.00205: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.00247: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.00298: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.00334: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.00438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.00480: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.00528: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.00587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.00616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.00742: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.00766: Evaluated conditional (ansible_distribution_major_version | int > 9): True 22736 1727204277.00920: variable 'ansible_distribution' from source: facts 22736 1727204277.00931: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.00951: Evaluated conditional (ansible_distribution in __network_rh_distros): False 22736 1727204277.00964: when evaluation is False, skipping this task 22736 1727204277.00974: _execute() done 22736 1727204277.00977: dumping result to json 22736 1727204277.00980: done dumping result, returning 22736 1727204277.00988: done running TaskExecutor() 
for managed-node2/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12b410aa-8751-4f4a-548a-000000000060] 22736 1727204277.00993: sending task result for task 12b410aa-8751-4f4a-548a-000000000060 22736 1727204277.01101: done sending task result for task 12b410aa-8751-4f4a-548a-000000000060 22736 1727204277.01105: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution in __network_rh_distros", "skip_reason": "Conditional result was False" } 22736 1727204277.01156: no more pending results, returning what we have 22736 1727204277.01161: results queue empty 22736 1727204277.01162: checking for any_errors_fatal 22736 1727204277.01172: done checking for any_errors_fatal 22736 1727204277.01173: checking for max_fail_percentage 22736 1727204277.01175: done checking for max_fail_percentage 22736 1727204277.01176: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.01177: done checking to see if all hosts have failed 22736 1727204277.01178: getting the remaining hosts for this loop 22736 1727204277.01180: done getting the remaining hosts for this loop 22736 1727204277.01186: getting the next task for host managed-node2 22736 1727204277.01195: done getting next task for host managed-node2 22736 1727204277.01199: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204277.01201: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204277.01217: getting variables 22736 1727204277.01220: in VariableManager get_vars() 22736 1727204277.01262: Calling all_inventory to load vars for managed-node2 22736 1727204277.01265: Calling groups_inventory to load vars for managed-node2 22736 1727204277.01268: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.01279: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.01282: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.01285: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.02529: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.04105: done with get_vars() 22736 1727204277.04131: done getting variables 22736 1727204277.04180: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.076) 0:00:41.826 ***** 22736 1727204277.04207: entering _queue_task() for managed-node2/dnf 22736 1727204277.04466: worker is 1 (out of 1 available) 22736 1727204277.04481: exiting _queue_task() for managed-node2/dnf 22736 1727204277.04496: done queuing things up, now waiting for results queue to drain 22736 1727204277.04498: waiting for pending results... 
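The DNF check queued above only needs to run when the requested connection profiles include wireless or team interfaces; the trace that follows walks the network_connections play vars (through the profile and interface variables) and finds neither, so the task is skipped with false_condition "__network_wireless_connections_defined or __network_team_connections_defined". The play vars themselves are hidden from this excerpt; an illustrative, ethernet-only profile that would produce exactly this skip might look like:

    # Invented example values: profile and interface here stand in for the
    # play's real variables, which are not shown in this trace.
    network_connections:
      - name: "{{ profile }}"
        interface_name: "{{ interface }}"
        type: ethernet
        state: up
        ip:
          dhcp4: true

Since no entry has type wireless or type team, both halves of the guard's condition come out False and the dnf action is never executed.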
22736 1727204277.04694: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 22736 1727204277.04785: in run() - task 12b410aa-8751-4f4a-548a-000000000061 22736 1727204277.04799: variable 'ansible_search_path' from source: unknown 22736 1727204277.04803: variable 'ansible_search_path' from source: unknown 22736 1727204277.04847: calling self._execute() 22736 1727204277.04922: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.04931: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.04948: variable 'omit' from source: magic vars 22736 1727204277.05277: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.05284: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.05460: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.07464: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.07527: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.07558: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.07592: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.07615: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.07688: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.07713: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.07739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.07773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.07787: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.07887: variable 'ansible_distribution' from source: facts 22736 1727204277.07892: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.07905: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 22736 1727204277.07995: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.08112: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.08137: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.08158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.08191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.08216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.08249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.08269: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.08288: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.08324: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.08338: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.08375: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.08401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.08424: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.08460: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.08474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.08603: variable 'network_connections' from source: play vars 22736 1727204277.08613: variable 'profile' from source: play vars 22736 1727204277.08677: variable 'profile' from source: play vars 22736 1727204277.08682: variable 'interface' from source: set_fact 22736 1727204277.08738: variable 'interface' from source: set_fact 22736 1727204277.08802: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' 
skipped due to reserved name 22736 1727204277.08937: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204277.08969: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204277.09010: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204277.09038: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204277.09074: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204277.09098: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204277.09126: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.09148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204277.09191: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204277.09393: variable 'network_connections' from source: play vars 22736 1727204277.09398: variable 'profile' from source: play vars 22736 1727204277.09454: variable 'profile' from source: play vars 22736 1727204277.09458: variable 'interface' from source: set_fact 22736 1727204277.09509: variable 'interface' from source: set_fact 22736 1727204277.09535: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204277.09539: when evaluation is False, skipping this task 22736 1727204277.09543: _execute() done 22736 1727204277.09546: dumping result to json 22736 1727204277.09548: done dumping result, returning 22736 1727204277.09595: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000061] 22736 1727204277.09598: sending task result for task 12b410aa-8751-4f4a-548a-000000000061 22736 1727204277.09665: done sending task result for task 12b410aa-8751-4f4a-548a-000000000061 22736 1727204277.09668: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204277.09721: no more pending results, returning what we have 22736 1727204277.09725: results queue empty 22736 1727204277.09726: checking for any_errors_fatal 22736 1727204277.09733: done checking for any_errors_fatal 22736 1727204277.09734: checking for max_fail_percentage 22736 1727204277.09736: done checking for max_fail_percentage 22736 1727204277.09737: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.09738: done checking to see if all hosts have failed 22736 1727204277.09738: getting the remaining hosts for this loop 22736 1727204277.09740: done getting the remaining hosts for this loop 22736 
1727204277.09745: getting the next task for host managed-node2 22736 1727204277.09750: done getting next task for host managed-node2 22736 1727204277.09755: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204277.09757: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204277.09772: getting variables 22736 1727204277.09774: in VariableManager get_vars() 22736 1727204277.09816: Calling all_inventory to load vars for managed-node2 22736 1727204277.09819: Calling groups_inventory to load vars for managed-node2 22736 1727204277.09822: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.09833: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.09836: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.09839: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.11162: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.13893: done with get_vars() 22736 1727204277.13933: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 22736 1727204277.14044: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.098) 0:00:41.925 ***** 22736 1727204277.14093: entering _queue_task() for managed-node2/yum 22736 1727204277.14430: worker is 1 (out of 1 available) 22736 1727204277.14445: exiting _queue_task() for managed-node2/yum 22736 1727204277.14460: done queuing things up, now waiting for results queue to drain 22736 1727204277.14462: waiting for pending results... 
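The "redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf" line above is normal on a DNF-based target: the yum action is an alias that resolves to the dnf action plugin. The YUM-path task queued here (tasks/main.yml:48) is gated on older distributions, and the trace that follows skips it because ansible_distribution_major_version | int < 8 is false on this host. A rough sketch, with module arguments assumed rather than taken from the role:

  - name: Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces
    ansible.builtin.yum:
      name: "{{ network_packages }}"   # assumed
      state: latest                    # assumed
    check_mode: true                   # assumed
    when: ansible_distribution_major_version | int < 8   # condition quoted from the log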
22736 1727204277.14909: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 22736 1727204277.14974: in run() - task 12b410aa-8751-4f4a-548a-000000000062 22736 1727204277.15003: variable 'ansible_search_path' from source: unknown 22736 1727204277.15015: variable 'ansible_search_path' from source: unknown 22736 1727204277.15096: calling self._execute() 22736 1727204277.15209: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.15263: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.15269: variable 'omit' from source: magic vars 22736 1727204277.15777: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.15808: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.16056: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.17820: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.17883: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.17920: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.17952: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.17974: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.18050: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.18074: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.18100: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.18138: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.18151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.18232: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.18245: Evaluated conditional (ansible_distribution_major_version | int < 8): False 22736 1727204277.18248: when evaluation is False, skipping this task 22736 1727204277.18251: _execute() done 22736 1727204277.18256: dumping result to json 22736 1727204277.18260: done dumping result, returning 22736 1727204277.18267: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000062] 22736 
1727204277.18270: sending task result for task 12b410aa-8751-4f4a-548a-000000000062 22736 1727204277.18367: done sending task result for task 12b410aa-8751-4f4a-548a-000000000062 22736 1727204277.18370: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 22736 1727204277.18424: no more pending results, returning what we have 22736 1727204277.18429: results queue empty 22736 1727204277.18430: checking for any_errors_fatal 22736 1727204277.18438: done checking for any_errors_fatal 22736 1727204277.18439: checking for max_fail_percentage 22736 1727204277.18440: done checking for max_fail_percentage 22736 1727204277.18441: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.18442: done checking to see if all hosts have failed 22736 1727204277.18443: getting the remaining hosts for this loop 22736 1727204277.18445: done getting the remaining hosts for this loop 22736 1727204277.18449: getting the next task for host managed-node2 22736 1727204277.18456: done getting next task for host managed-node2 22736 1727204277.18460: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204277.18462: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204277.18476: getting variables 22736 1727204277.18478: in VariableManager get_vars() 22736 1727204277.18523: Calling all_inventory to load vars for managed-node2 22736 1727204277.18526: Calling groups_inventory to load vars for managed-node2 22736 1727204277.18529: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.18539: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.18543: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.18546: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.19843: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.21442: done with get_vars() 22736 1727204277.21465: done getting variables 22736 1727204277.21516: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.074) 0:00:42.000 ***** 22736 1727204277.21546: entering _queue_task() for managed-node2/fail 22736 1727204277.21806: worker is 1 (out of 1 available) 22736 1727204277.21822: exiting _queue_task() for managed-node2/fail 22736 1727204277.21836: done queuing things up, now waiting for results queue to drain 22736 1727204277.21838: waiting for pending results... 
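The task queued here (tasks/main.yml:60) uses the fail action as a guard: when wireless or team connections are defined, presumably combined with a consent flag that does not appear in this part of the trace, the role aborts rather than restart NetworkManager silently. Only the condition is visible in the log; the message below is hypothetical wording:

  - name: Ask user's consent to restart NetworkManager due to wireless or team interfaces
    ansible.builtin.fail:
      msg: NetworkManager needs to be restarted to handle wireless or team interfaces   # hypothetical text
    when: __network_wireless_connections_defined or __network_team_connections_defined  # condition from the trace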
22736 1727204277.22035: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 22736 1727204277.22121: in run() - task 12b410aa-8751-4f4a-548a-000000000063 22736 1727204277.22131: variable 'ansible_search_path' from source: unknown 22736 1727204277.22136: variable 'ansible_search_path' from source: unknown 22736 1727204277.22172: calling self._execute() 22736 1727204277.22255: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.22262: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.22272: variable 'omit' from source: magic vars 22736 1727204277.22603: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.22623: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.22723: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.22897: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.24660: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.24730: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.24762: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.24812: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.24923: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.24928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.24931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.24946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.24979: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.24993: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.25041: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.25062: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.25082: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.25115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.25130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.25170: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.25192: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.25212: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.25249: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.25258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.25410: variable 'network_connections' from source: play vars 22736 1727204277.25469: variable 'profile' from source: play vars 22736 1727204277.25498: variable 'profile' from source: play vars 22736 1727204277.25501: variable 'interface' from source: set_fact 22736 1727204277.25558: variable 'interface' from source: set_fact 22736 1727204277.25625: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204277.25769: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204277.25805: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204277.25834: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204277.25859: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204277.25901: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204277.25994: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204277.25998: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.26001: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204277.26006: 
variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204277.26218: variable 'network_connections' from source: play vars 22736 1727204277.26225: variable 'profile' from source: play vars 22736 1727204277.26280: variable 'profile' from source: play vars 22736 1727204277.26284: variable 'interface' from source: set_fact 22736 1727204277.26340: variable 'interface' from source: set_fact 22736 1727204277.26365: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204277.26368: when evaluation is False, skipping this task 22736 1727204277.26371: _execute() done 22736 1727204277.26376: dumping result to json 22736 1727204277.26380: done dumping result, returning 22736 1727204277.26388: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000063] 22736 1727204277.26401: sending task result for task 12b410aa-8751-4f4a-548a-000000000063 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204277.26559: no more pending results, returning what we have 22736 1727204277.26562: results queue empty 22736 1727204277.26563: checking for any_errors_fatal 22736 1727204277.26570: done checking for any_errors_fatal 22736 1727204277.26571: checking for max_fail_percentage 22736 1727204277.26572: done checking for max_fail_percentage 22736 1727204277.26573: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.26574: done checking to see if all hosts have failed 22736 1727204277.26575: getting the remaining hosts for this loop 22736 1727204277.26577: done getting the remaining hosts for this loop 22736 1727204277.26581: getting the next task for host managed-node2 22736 1727204277.26588: done getting next task for host managed-node2 22736 1727204277.26594: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 22736 1727204277.26597: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204277.26614: getting variables 22736 1727204277.26616: in VariableManager get_vars() 22736 1727204277.26661: Calling all_inventory to load vars for managed-node2 22736 1727204277.26664: Calling groups_inventory to load vars for managed-node2 22736 1727204277.26667: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.26678: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.26681: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.26684: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.26703: done sending task result for task 12b410aa-8751-4f4a-548a-000000000063 22736 1727204277.26706: WORKER PROCESS EXITING 22736 1727204277.28008: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.29611: done with get_vars() 22736 1727204277.29649: done getting variables 22736 1727204277.29708: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.081) 0:00:42.082 ***** 22736 1727204277.29741: entering _queue_task() for managed-node2/package 22736 1727204277.30031: worker is 1 (out of 1 available) 22736 1727204277.30047: exiting _queue_task() for managed-node2/package 22736 1727204277.30061: done queuing things up, now waiting for results queue to drain 22736 1727204277.30063: waiting for pending results... 
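The Install packages task queued here (tasks/main.yml:73) ends up skipped because every package in network_packages is already present in the collected package facts, so no package-manager transaction is needed. The when-expression below is quoted from the log's false_condition; the module arguments are assumptions:

  - name: Install packages
    ansible.builtin.package:
      name: "{{ network_packages }}"   # the trace shows network_packages resolved from the role defaults
      state: present                   # assumed
    when: not network_packages is subset(ansible_facts.packages.keys())

Testing the wanted set against ansible_facts.packages keeps repeated runs fast and idempotent: the package manager is only invoked when something is actually missing.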
22736 1727204277.30259: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 22736 1727204277.30348: in run() - task 12b410aa-8751-4f4a-548a-000000000064 22736 1727204277.30362: variable 'ansible_search_path' from source: unknown 22736 1727204277.30367: variable 'ansible_search_path' from source: unknown 22736 1727204277.30410: calling self._execute() 22736 1727204277.30486: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.30494: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.30506: variable 'omit' from source: magic vars 22736 1727204277.30833: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.30849: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.31016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204277.31256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204277.31299: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204277.31330: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204277.31668: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204277.31765: variable 'network_packages' from source: role '' defaults 22736 1727204277.31860: variable '__network_provider_setup' from source: role '' defaults 22736 1727204277.31873: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204277.31939: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204277.31948: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204277.32001: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204277.32160: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.33728: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.33784: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.33820: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.33847: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.33869: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.33943: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.33977: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.34001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.34039: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.34051: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.34092: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.34113: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.34139: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.34169: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.34181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.34377: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204277.34477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.34499: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.34522: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.34555: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.34570: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.34651: variable 'ansible_python' from source: facts 22736 1727204277.34674: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204277.34745: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204277.34816: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204277.34925: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.34945: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22736 1727204277.34965: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.35002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.35015: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.35054: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.35079: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.35102: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.35137: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.35149: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.35272: variable 'network_connections' from source: play vars 22736 1727204277.35278: variable 'profile' from source: play vars 22736 1727204277.35363: variable 'profile' from source: play vars 22736 1727204277.35369: variable 'interface' from source: set_fact 22736 1727204277.35430: variable 'interface' from source: set_fact 22736 1727204277.35487: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204277.35511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204277.35538: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.35567: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204277.35608: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.35841: variable 'network_connections' from source: play vars 22736 1727204277.35846: variable 'profile' from source: play vars 22736 1727204277.35929: variable 'profile' from source: play vars 22736 1727204277.35936: variable 'interface' from source: set_fact 22736 1727204277.35999: variable 'interface' from source: set_fact 22736 1727204277.36026: variable 
'__network_packages_default_wireless' from source: role '' defaults 22736 1727204277.36093: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.36341: variable 'network_connections' from source: play vars 22736 1727204277.36345: variable 'profile' from source: play vars 22736 1727204277.36404: variable 'profile' from source: play vars 22736 1727204277.36407: variable 'interface' from source: set_fact 22736 1727204277.36487: variable 'interface' from source: set_fact 22736 1727204277.36511: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204277.36578: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204277.36831: variable 'network_connections' from source: play vars 22736 1727204277.36834: variable 'profile' from source: play vars 22736 1727204277.36892: variable 'profile' from source: play vars 22736 1727204277.36895: variable 'interface' from source: set_fact 22736 1727204277.36975: variable 'interface' from source: set_fact 22736 1727204277.37024: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204277.37075: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204277.37082: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204277.37134: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204277.37322: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204277.37705: variable 'network_connections' from source: play vars 22736 1727204277.37710: variable 'profile' from source: play vars 22736 1727204277.37764: variable 'profile' from source: play vars 22736 1727204277.37767: variable 'interface' from source: set_fact 22736 1727204277.37824: variable 'interface' from source: set_fact 22736 1727204277.37833: variable 'ansible_distribution' from source: facts 22736 1727204277.37836: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.37844: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.37857: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204277.38000: variable 'ansible_distribution' from source: facts 22736 1727204277.38003: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.38010: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.38017: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204277.38158: variable 'ansible_distribution' from source: facts 22736 1727204277.38163: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.38166: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.38198: variable 'network_provider' from source: set_fact 22736 1727204277.38215: variable 'ansible_facts' from source: unknown 22736 1727204277.38834: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 22736 1727204277.38838: when evaluation is False, skipping this task 22736 1727204277.38841: _execute() done 22736 1727204277.38844: dumping result to json 22736 1727204277.38846: done dumping result, returning 22736 1727204277.38854: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install packages 
[12b410aa-8751-4f4a-548a-000000000064] 22736 1727204277.38858: sending task result for task 12b410aa-8751-4f4a-548a-000000000064 22736 1727204277.38961: done sending task result for task 12b410aa-8751-4f4a-548a-000000000064 22736 1727204277.38965: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "not network_packages is subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 22736 1727204277.39023: no more pending results, returning what we have 22736 1727204277.39027: results queue empty 22736 1727204277.39028: checking for any_errors_fatal 22736 1727204277.39036: done checking for any_errors_fatal 22736 1727204277.39037: checking for max_fail_percentage 22736 1727204277.39038: done checking for max_fail_percentage 22736 1727204277.39039: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.39040: done checking to see if all hosts have failed 22736 1727204277.39041: getting the remaining hosts for this loop 22736 1727204277.39043: done getting the remaining hosts for this loop 22736 1727204277.39047: getting the next task for host managed-node2 22736 1727204277.39054: done getting next task for host managed-node2 22736 1727204277.39058: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204277.39060: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204277.39076: getting variables 22736 1727204277.39078: in VariableManager get_vars() 22736 1727204277.39125: Calling all_inventory to load vars for managed-node2 22736 1727204277.39128: Calling groups_inventory to load vars for managed-node2 22736 1727204277.39130: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.39147: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.39151: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.39154: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.40540: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.42109: done with get_vars() 22736 1727204277.42140: done getting variables 22736 1727204277.42192: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.124) 0:00:42.207 ***** 22736 1727204277.42219: entering _queue_task() for managed-node2/package 22736 1727204277.42492: worker is 1 (out of 1 available) 22736 1727204277.42509: exiting _queue_task() for managed-node2/package 22736 1727204277.42526: done queuing things up, now waiting for results queue to drain 22736 1727204277.42528: waiting for pending results... 
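The task queued here (tasks/main.yml:85) only fires when the caller drives the role through the declarative network_state variable; in this run network_state is empty, so it is skipped. The package names below are inferred from the task title, not from the log:

  - name: Install NetworkManager and nmstate when using network_state variable
    ansible.builtin.package:
      name:
        - NetworkManager   # inferred from the task name
        - nmstate          # inferred from the task name
      state: present       # assumed
    when: network_state != {}   # condition quoted from the log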
22736 1727204277.42711: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 22736 1727204277.42794: in run() - task 12b410aa-8751-4f4a-548a-000000000065 22736 1727204277.42828: variable 'ansible_search_path' from source: unknown 22736 1727204277.42833: variable 'ansible_search_path' from source: unknown 22736 1727204277.42858: calling self._execute() 22736 1727204277.42949: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.42956: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.42965: variable 'omit' from source: magic vars 22736 1727204277.43292: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.43309: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.43409: variable 'network_state' from source: role '' defaults 22736 1727204277.43422: Evaluated conditional (network_state != {}): False 22736 1727204277.43425: when evaluation is False, skipping this task 22736 1727204277.43428: _execute() done 22736 1727204277.43431: dumping result to json 22736 1727204277.43436: done dumping result, returning 22736 1727204277.43444: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000065] 22736 1727204277.43450: sending task result for task 12b410aa-8751-4f4a-548a-000000000065 22736 1727204277.43553: done sending task result for task 12b410aa-8751-4f4a-548a-000000000065 22736 1727204277.43556: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204277.43608: no more pending results, returning what we have 22736 1727204277.43611: results queue empty 22736 1727204277.43613: checking for any_errors_fatal 22736 1727204277.43619: done checking for any_errors_fatal 22736 1727204277.43620: checking for max_fail_percentage 22736 1727204277.43622: done checking for max_fail_percentage 22736 1727204277.43623: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.43624: done checking to see if all hosts have failed 22736 1727204277.43625: getting the remaining hosts for this loop 22736 1727204277.43627: done getting the remaining hosts for this loop 22736 1727204277.43631: getting the next task for host managed-node2 22736 1727204277.43637: done getting next task for host managed-node2 22736 1727204277.43641: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204277.43644: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204277.43661: getting variables 22736 1727204277.43663: in VariableManager get_vars() 22736 1727204277.43708: Calling all_inventory to load vars for managed-node2 22736 1727204277.43711: Calling groups_inventory to load vars for managed-node2 22736 1727204277.43714: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.43724: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.43728: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.43731: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.45031: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.46603: done with get_vars() 22736 1727204277.46630: done getting variables 22736 1727204277.46682: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.044) 0:00:42.252 ***** 22736 1727204277.46711: entering _queue_task() for managed-node2/package 22736 1727204277.46994: worker is 1 (out of 1 available) 22736 1727204277.47009: exiting _queue_task() for managed-node2/package 22736 1727204277.47026: done queuing things up, now waiting for results queue to drain 22736 1727204277.47028: waiting for pending results... 
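Same pattern for the nmstate Python bindings (tasks/main.yml:96), again skipped because network_state is empty. The package name comes from the task title; the rest is assumed:

  - name: Install python3-libnmstate when using network_state variable
    ansible.builtin.package:
      name: python3-libnmstate
      state: present            # assumed
    when: network_state != {}   # condition quoted from the log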
22736 1727204277.47222: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 22736 1727204277.47306: in run() - task 12b410aa-8751-4f4a-548a-000000000066 22736 1727204277.47323: variable 'ansible_search_path' from source: unknown 22736 1727204277.47327: variable 'ansible_search_path' from source: unknown 22736 1727204277.47360: calling self._execute() 22736 1727204277.47440: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.47446: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.47456: variable 'omit' from source: magic vars 22736 1727204277.47776: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.47787: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.47893: variable 'network_state' from source: role '' defaults 22736 1727204277.47904: Evaluated conditional (network_state != {}): False 22736 1727204277.47907: when evaluation is False, skipping this task 22736 1727204277.47912: _execute() done 22736 1727204277.47914: dumping result to json 22736 1727204277.47930: done dumping result, returning 22736 1727204277.47933: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12b410aa-8751-4f4a-548a-000000000066] 22736 1727204277.47936: sending task result for task 12b410aa-8751-4f4a-548a-000000000066 22736 1727204277.48038: done sending task result for task 12b410aa-8751-4f4a-548a-000000000066 22736 1727204277.48041: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204277.48099: no more pending results, returning what we have 22736 1727204277.48103: results queue empty 22736 1727204277.48105: checking for any_errors_fatal 22736 1727204277.48113: done checking for any_errors_fatal 22736 1727204277.48114: checking for max_fail_percentage 22736 1727204277.48116: done checking for max_fail_percentage 22736 1727204277.48119: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.48120: done checking to see if all hosts have failed 22736 1727204277.48121: getting the remaining hosts for this loop 22736 1727204277.48123: done getting the remaining hosts for this loop 22736 1727204277.48128: getting the next task for host managed-node2 22736 1727204277.48133: done getting next task for host managed-node2 22736 1727204277.48137: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204277.48139: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204277.48157: getting variables 22736 1727204277.48159: in VariableManager get_vars() 22736 1727204277.48198: Calling all_inventory to load vars for managed-node2 22736 1727204277.48201: Calling groups_inventory to load vars for managed-node2 22736 1727204277.48203: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.48215: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.48220: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.48223: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.49442: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.51037: done with get_vars() 22736 1727204277.51064: done getting variables 22736 1727204277.51124: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.044) 0:00:42.296 ***** 22736 1727204277.51151: entering _queue_task() for managed-node2/service 22736 1727204277.51427: worker is 1 (out of 1 available) 22736 1727204277.51443: exiting _queue_task() for managed-node2/service 22736 1727204277.51458: done queuing things up, now waiting for results queue to drain 22736 1727204277.51460: waiting for pending results... 
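The restart task queued here (tasks/main.yml:109) loads the service action, and the trace that follows evaluates the same wireless/team conditions seen for the earlier tasks. A minimal sketch, with the service name taken from the task title and the condition from the trace:

  - name: Restart NetworkManager due to wireless or team interfaces
    ansible.builtin.service:
      name: NetworkManager
      state: restarted
    when: __network_wireless_connections_defined or __network_team_connections_defined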
22736 1727204277.51654: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 22736 1727204277.51737: in run() - task 12b410aa-8751-4f4a-548a-000000000067 22736 1727204277.51750: variable 'ansible_search_path' from source: unknown 22736 1727204277.51753: variable 'ansible_search_path' from source: unknown 22736 1727204277.51788: calling self._execute() 22736 1727204277.51867: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.51875: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.51886: variable 'omit' from source: magic vars 22736 1727204277.52205: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.52216: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.52323: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.52495: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.54458: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.54524: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.54557: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.54586: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.54612: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.54683: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.54710: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.54733: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.54772: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.54784: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.54828: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.54847: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.54872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 22736 1727204277.54905: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.54919: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.54953: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.54976: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.55000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.55031: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.55043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.55189: variable 'network_connections' from source: play vars 22736 1727204277.55201: variable 'profile' from source: play vars 22736 1727204277.55264: variable 'profile' from source: play vars 22736 1727204277.55268: variable 'interface' from source: set_fact 22736 1727204277.55325: variable 'interface' from source: set_fact 22736 1727204277.55385: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204277.55522: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204277.55554: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204277.55591: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204277.55616: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204277.55659: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204277.55677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204277.55700: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.55723: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204277.55768: variable '__network_team_connections_defined' from source: role '' defaults 22736 
1727204277.55971: variable 'network_connections' from source: play vars 22736 1727204277.55977: variable 'profile' from source: play vars 22736 1727204277.56029: variable 'profile' from source: play vars 22736 1727204277.56033: variable 'interface' from source: set_fact 22736 1727204277.56087: variable 'interface' from source: set_fact 22736 1727204277.56110: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 22736 1727204277.56113: when evaluation is False, skipping this task 22736 1727204277.56116: _execute() done 22736 1727204277.56121: dumping result to json 22736 1727204277.56124: done dumping result, returning 22736 1727204277.56132: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12b410aa-8751-4f4a-548a-000000000067] 22736 1727204277.56143: sending task result for task 12b410aa-8751-4f4a-548a-000000000067 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 22736 1727204277.56292: no more pending results, returning what we have 22736 1727204277.56295: results queue empty 22736 1727204277.56297: checking for any_errors_fatal 22736 1727204277.56305: done checking for any_errors_fatal 22736 1727204277.56306: checking for max_fail_percentage 22736 1727204277.56307: done checking for max_fail_percentage 22736 1727204277.56308: checking to see if all hosts have failed and the running result is not ok 22736 1727204277.56309: done checking to see if all hosts have failed 22736 1727204277.56310: getting the remaining hosts for this loop 22736 1727204277.56312: done getting the remaining hosts for this loop 22736 1727204277.56316: getting the next task for host managed-node2 22736 1727204277.56325: done getting next task for host managed-node2 22736 1727204277.56329: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204277.56331: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204277.56347: getting variables 22736 1727204277.56350: in VariableManager get_vars() 22736 1727204277.56394: Calling all_inventory to load vars for managed-node2 22736 1727204277.56398: Calling groups_inventory to load vars for managed-node2 22736 1727204277.56401: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204277.56411: Calling all_plugins_play to load vars for managed-node2 22736 1727204277.56415: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204277.56420: Calling groups_plugins_play to load vars for managed-node2 22736 1727204277.57515: done sending task result for task 12b410aa-8751-4f4a-548a-000000000067 22736 1727204277.57519: WORKER PROCESS EXITING 22736 1727204277.57815: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204277.59386: done with get_vars() 22736 1727204277.59412: done getting variables 22736 1727204277.59466: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Tuesday 24 September 2024 14:57:57 -0400 (0:00:00.083) 0:00:42.379 ***** 22736 1727204277.59493: entering _queue_task() for managed-node2/service 22736 1727204277.59763: worker is 1 (out of 1 available) 22736 1727204277.59777: exiting _queue_task() for managed-node2/service 22736 1727204277.59792: done queuing things up, now waiting for results queue to drain 22736 1727204277.59794: waiting for pending results... 
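The "Enable and start NetworkManager" task queued above resolves to the service action plugin with the arguments seen later in the module invocation; a minimal sketch of a task of that shape (an assumed form, not the role's verbatim source) is:

    - name: Enable and start NetworkManager
      ansible.builtin.service:
        name: "{{ network_service_name }}"
        state: started
        enabled: true
      when: network_provider == "nm" or network_state != {}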
22736 1727204277.59986: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 22736 1727204277.60070: in run() - task 12b410aa-8751-4f4a-548a-000000000068 22736 1727204277.60082: variable 'ansible_search_path' from source: unknown 22736 1727204277.60086: variable 'ansible_search_path' from source: unknown 22736 1727204277.60123: calling self._execute() 22736 1727204277.60202: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.60210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.60223: variable 'omit' from source: magic vars 22736 1727204277.60537: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.60549: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204277.60688: variable 'network_provider' from source: set_fact 22736 1727204277.60696: variable 'network_state' from source: role '' defaults 22736 1727204277.60707: Evaluated conditional (network_provider == "nm" or network_state != {}): True 22736 1727204277.60714: variable 'omit' from source: magic vars 22736 1727204277.60752: variable 'omit' from source: magic vars 22736 1727204277.60777: variable 'network_service_name' from source: role '' defaults 22736 1727204277.60844: variable 'network_service_name' from source: role '' defaults 22736 1727204277.60934: variable '__network_provider_setup' from source: role '' defaults 22736 1727204277.60941: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204277.60994: variable '__network_service_name_default_nm' from source: role '' defaults 22736 1727204277.61008: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204277.61059: variable '__network_packages_default_nm' from source: role '' defaults 22736 1727204277.61259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204277.62970: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204277.63036: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204277.63070: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204277.63106: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204277.63132: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204277.63207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.63235: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.63257: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.63290: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 
(found_in_cache=True, class_only=False) 22736 1727204277.63307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.63349: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.63369: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.63391: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.63428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.63441: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.63640: variable '__network_packages_default_gobject_packages' from source: role '' defaults 22736 1727204277.63739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.63762: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.63783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.63815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.63830: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.63908: variable 'ansible_python' from source: facts 22736 1727204277.63930: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 22736 1727204277.64002: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204277.64069: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204277.64177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.64203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.64227: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.64258: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.64270: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.64315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204277.64341: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204277.64362: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.64394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204277.64408: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204277.64522: variable 'network_connections' from source: play vars 22736 1727204277.64532: variable 'profile' from source: play vars 22736 1727204277.64593: variable 'profile' from source: play vars 22736 1727204277.64599: variable 'interface' from source: set_fact 22736 1727204277.64654: variable 'interface' from source: set_fact 22736 1727204277.64742: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204277.65018: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204277.65059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204277.65103: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204277.65139: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204277.65196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204277.65224: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204277.65249: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204277.65279: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
(found_in_cache=True, class_only=False) 22736 1727204277.65323: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.65550: variable 'network_connections' from source: play vars 22736 1727204277.65556: variable 'profile' from source: play vars 22736 1727204277.65623: variable 'profile' from source: play vars 22736 1727204277.65627: variable 'interface' from source: set_fact 22736 1727204277.65676: variable 'interface' from source: set_fact 22736 1727204277.65706: variable '__network_packages_default_wireless' from source: role '' defaults 22736 1727204277.65774: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204277.66015: variable 'network_connections' from source: play vars 22736 1727204277.66021: variable 'profile' from source: play vars 22736 1727204277.66081: variable 'profile' from source: play vars 22736 1727204277.66085: variable 'interface' from source: set_fact 22736 1727204277.66148: variable 'interface' from source: set_fact 22736 1727204277.66172: variable '__network_packages_default_team' from source: role '' defaults 22736 1727204277.66238: variable '__network_team_connections_defined' from source: role '' defaults 22736 1727204277.66474: variable 'network_connections' from source: play vars 22736 1727204277.66480: variable 'profile' from source: play vars 22736 1727204277.66542: variable 'profile' from source: play vars 22736 1727204277.66546: variable 'interface' from source: set_fact 22736 1727204277.66610: variable 'interface' from source: set_fact 22736 1727204277.66653: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204277.66707: variable '__network_service_name_default_initscripts' from source: role '' defaults 22736 1727204277.66713: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204277.66765: variable '__network_packages_default_initscripts' from source: role '' defaults 22736 1727204277.66950: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 22736 1727204277.67356: variable 'network_connections' from source: play vars 22736 1727204277.67360: variable 'profile' from source: play vars 22736 1727204277.67414: variable 'profile' from source: play vars 22736 1727204277.67420: variable 'interface' from source: set_fact 22736 1727204277.67477: variable 'interface' from source: set_fact 22736 1727204277.67488: variable 'ansible_distribution' from source: facts 22736 1727204277.67494: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.67497: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.67644: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 22736 1727204277.67713: variable 'ansible_distribution' from source: facts 22736 1727204277.67722: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.67729: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.67732: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 22736 1727204277.67877: variable 'ansible_distribution' from source: facts 22736 1727204277.67883: variable '__network_rh_distros' from source: role '' defaults 22736 1727204277.67890: variable 'ansible_distribution_major_version' from source: facts 22736 1727204277.67919: variable 'network_provider' from source: set_fact 22736 1727204277.67944: variable 
'omit' from source: magic vars 22736 1727204277.67969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204277.67994: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204277.68011: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204277.68029: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204277.68041: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204277.68071: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204277.68074: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.68079: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.68166: Set connection var ansible_timeout to 10 22736 1727204277.68177: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204277.68186: Set connection var ansible_shell_executable to /bin/sh 22736 1727204277.68190: Set connection var ansible_shell_type to sh 22736 1727204277.68197: Set connection var ansible_pipelining to False 22736 1727204277.68199: Set connection var ansible_connection to ssh 22736 1727204277.68225: variable 'ansible_shell_executable' from source: unknown 22736 1727204277.68228: variable 'ansible_connection' from source: unknown 22736 1727204277.68231: variable 'ansible_module_compression' from source: unknown 22736 1727204277.68234: variable 'ansible_shell_type' from source: unknown 22736 1727204277.68237: variable 'ansible_shell_executable' from source: unknown 22736 1727204277.68242: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204277.68248: variable 'ansible_pipelining' from source: unknown 22736 1727204277.68252: variable 'ansible_timeout' from source: unknown 22736 1727204277.68254: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204277.68346: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204277.68357: variable 'omit' from source: magic vars 22736 1727204277.68363: starting attempt loop 22736 1727204277.68368: running the handler 22736 1727204277.68439: variable 'ansible_facts' from source: unknown 22736 1727204277.69138: _low_level_execute_command(): starting 22736 1727204277.69145: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204277.69702: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204277.69706: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.69709: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204277.69722: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204277.69741: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.69784: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204277.69829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204277.69894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204277.71683: stdout chunk (state=3): >>>/root <<< 22736 1727204277.71791: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204277.71845: stderr chunk (state=3): >>><<< 22736 1727204277.71848: stdout chunk (state=3): >>><<< 22736 1727204277.71866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204277.71878: _low_level_execute_command(): starting 22736 1727204277.71884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904 `" && echo ansible-tmp-1727204277.718669-24979-140849618467904="` echo /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904 `" ) && sleep 0' 22736 1727204277.72339: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204277.72342: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.72345: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204277.72347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.72403: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204277.72406: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204277.72456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204277.74570: stdout chunk (state=3): >>>ansible-tmp-1727204277.718669-24979-140849618467904=/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904 <<< 22736 1727204277.74997: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204277.75001: stdout chunk (state=3): >>><<< 22736 1727204277.75003: stderr chunk (state=3): >>><<< 22736 1727204277.75006: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204277.718669-24979-140849618467904=/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204277.75009: variable 'ansible_module_compression' from source: unknown 22736 1727204277.75011: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 22736 1727204277.75013: variable 'ansible_facts' from source: unknown 22736 1727204277.75224: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py 22736 1727204277.75483: Sending initial data 22736 1727204277.75486: Sent initial data (155 bytes) 22736 1727204277.76210: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204277.76215: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.76260: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204277.76276: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204277.76325: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204277.76371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204277.78146: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204277.78201: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204277.78249: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprilesfie /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py <<< 22736 1727204277.78253: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py" <<< 22736 1727204277.78284: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmprilesfie" to remote "/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py" <<< 22736 1727204277.80798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204277.80948: stderr chunk (state=3): >>><<< 22736 1727204277.80951: stdout chunk (state=3): >>><<< 22736 1727204277.80954: done transferring module to remote 22736 1727204277.80956: _low_level_execute_command(): starting 22736 1727204277.80959: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/ /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py && sleep 0' 22736 1727204277.81607: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.81659: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204277.81682: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204277.81709: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204277.81778: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204277.83896: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204277.83899: stdout chunk (state=3): >>><<< 22736 1727204277.83902: stderr chunk (state=3): >>><<< 22736 1727204277.83904: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204277.83906: _low_level_execute_command(): starting 22736 1727204277.83908: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/AnsiballZ_systemd.py && sleep 0' 22736 1727204277.84552: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204277.84597: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204277.84618: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204277.84696: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204277.84743: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204277.84769: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204277.84825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204277.84883: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.18358: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4407296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1337729000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": 
"infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": 
"no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": <<< 22736 1727204278.18370: stdout chunk (state=3): >>>"loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": 
"inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 22736 1727204278.20391: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204278.20404: stdout chunk (state=3): >>><<< 22736 1727204278.20417: stderr chunk (state=3): >>><<< 22736 1727204278.20598: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "45s", "TimeoutAbortUSec": "45s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "abort", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "3356", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ExecMainStartTimestampMonotonic": "406531145", "ExecMainExitTimestampMonotonic": "0", "ExecMainPID": "3356", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "5133", "MemoryCurrent": "4407296", "MemoryAvailable": "infinity", "CPUUsageNSec": "1337729000", "TasksCurrent": "4", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "18446744073709551615", "IOReadOperations": "18446744073709551615", "IOWriteBytes": "18446744073709551615", "IOWriteOperations": "18446744073709551615", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", 
"BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "18446744073709551615", "MemoryMax": "infinity", "StartupMemoryMax": "18446744073709551615", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "18446744073709551615", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "18446744073709551615", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "4425", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "14752", "LimitNPROCSoft": "14752", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "14752", "LimitSIGPENDINGSoft": "14752", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": 
"no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket sysinit.target system.slice", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "NetworkManager-wait-online.service shutdown.target network.service network.target multi-user.target cloud-init.service", "After": "dbus.socket basic.target system.slice cloud-init-local.service dbus-broker.service network-pre.target sysinit.target systemd-journald.socket", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "DropInPaths": "/usr/lib/systemd/system/service.d/10-timeout-abort.conf", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Tue 2024-09-24 14:54:05 EDT", "StateChangeTimestampMonotonic": "549790843", "InactiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveExitTimestampMonotonic": "406531448", "ActiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveEnterTimestampMonotonic": "406627687", "ActiveExitTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ActiveExitTimestampMonotonic": "406493130", "InactiveEnterTimestamp": "Tue 2024-09-24 14:51:42 EDT", "InactiveEnterTimestampMonotonic": "406526352", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Tue 2024-09-24 14:51:42 EDT", "ConditionTimestampMonotonic": "406527163", "AssertTimestamp": "Tue 2024-09-24 14:51:42 EDT", "AssertTimestampMonotonic": "406527166", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "876a1c99afe7488d8feb64cca47a5183", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, 
"daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204278.20747: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204278.20777: _low_level_execute_command(): starting 22736 1727204278.20788: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204277.718669-24979-140849618467904/ > /dev/null 2>&1 && sleep 0' 22736 1727204278.21420: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.21434: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.21450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.21562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master <<< 22736 1727204278.21584: stderr 
chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204278.21608: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204278.21686: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.23780: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204278.23795: stdout chunk (state=3): >>><<< 22736 1727204278.23809: stderr chunk (state=3): >>><<< 22736 1727204278.23834: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204278.23849: handler run complete 22736 1727204278.23957: attempt loop complete, returning result 22736 1727204278.23967: _execute() done 22736 1727204278.23975: dumping result to json 22736 1727204278.24012: done dumping result, returning 22736 1727204278.24034: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12b410aa-8751-4f4a-548a-000000000068] 22736 1727204278.24044: sending task result for task 12b410aa-8751-4f4a-548a-000000000068 ok: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204278.24507: no more pending results, returning what we have 22736 1727204278.24511: results queue empty 22736 1727204278.24513: checking for any_errors_fatal 22736 1727204278.24520: done checking for any_errors_fatal 22736 1727204278.24521: checking for max_fail_percentage 22736 1727204278.24523: done checking for max_fail_percentage 22736 1727204278.24524: checking to see if all hosts have failed and the running result is not ok 22736 1727204278.24525: done checking to see if all hosts have failed 22736 1727204278.24526: getting the remaining hosts for this loop 22736 1727204278.24528: done getting the remaining hosts for this loop 22736 1727204278.24533: getting the next task for host managed-node2 22736 1727204278.24541: done getting next task for host managed-node2 22736 1727204278.24545: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204278.24548: ^ state is: HOST STATE: block=2, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(None), did rescue? False, did start at task? False 22736 1727204278.24564: getting variables 22736 1727204278.24567: in VariableManager get_vars() 22736 1727204278.24611: Calling all_inventory to load vars for managed-node2 22736 1727204278.24615: Calling groups_inventory to load vars for managed-node2 22736 1727204278.24618: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204278.24631: Calling all_plugins_play to load vars for managed-node2 22736 1727204278.24635: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204278.24639: Calling groups_plugins_play to load vars for managed-node2 22736 1727204278.25421: done sending task result for task 12b410aa-8751-4f4a-548a-000000000068 22736 1727204278.25425: WORKER PROCESS EXITING 22736 1727204278.27506: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204278.30672: done with get_vars() 22736 1727204278.30727: done getting variables 22736 1727204278.30808: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.713) 0:00:43.093 ***** 22736 1727204278.30845: entering _queue_task() for managed-node2/service 22736 1727204278.31275: worker is 1 (out of 1 available) 22736 1727204278.31343: exiting _queue_task() for managed-node2/service 22736 1727204278.31357: done queuing things up, now waiting for results queue to drain 22736 1727204278.31359: waiting for pending results... 
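The "Enable and start NetworkManager" task that just completed reduces to a single call to the systemd service module, and the logged module_args show exactly what was passed. A minimal, self-contained sketch of an equivalent task, assuming those same arguments (the role's real task file at roles/network/tasks/main.yml is not reproduced in this log, and ansible.legacy.systemd is written here as its builtin equivalent):

    - name: Enable and start NetworkManager
      ansible.builtin.systemd:
        name: NetworkManager        # unit name from the logged module_args
        state: started              # matches "state": "started"
        enabled: true               # matches "enabled": true
        daemon_reload: false
        scope: system
      no_log: true                  # why the play-level result above is reported as "censored"

Because the unit was already active and enabled on the managed node, the module reports changed: false.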
22736 1727204278.31612: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 22736 1727204278.31754: in run() - task 12b410aa-8751-4f4a-548a-000000000069 22736 1727204278.31897: variable 'ansible_search_path' from source: unknown 22736 1727204278.31901: variable 'ansible_search_path' from source: unknown 22736 1727204278.31905: calling self._execute() 22736 1727204278.31966: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.31982: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.32013: variable 'omit' from source: magic vars 22736 1727204278.32477: variable 'ansible_distribution_major_version' from source: facts 22736 1727204278.32499: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204278.32672: variable 'network_provider' from source: set_fact 22736 1727204278.32685: Evaluated conditional (network_provider == "nm"): True 22736 1727204278.32819: variable '__network_wpa_supplicant_required' from source: role '' defaults 22736 1727204278.32943: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 22736 1727204278.33186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204278.35470: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204278.35535: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204278.35567: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204278.35600: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204278.35630: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204278.35712: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204278.35742: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204278.35764: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204278.35798: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204278.35812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204278.35859: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204278.35879: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 22736 1727204278.35901: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204278.35938: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204278.35951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204278.35988: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204278.36009: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204278.36035: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204278.36066: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204278.36080: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204278.36207: variable 'network_connections' from source: play vars 22736 1727204278.36219: variable 'profile' from source: play vars 22736 1727204278.36294: variable 'profile' from source: play vars 22736 1727204278.36304: variable 'interface' from source: set_fact 22736 1727204278.36358: variable 'interface' from source: set_fact 22736 1727204278.36466: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 22736 1727204278.36695: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 22736 1727204278.36724: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 22736 1727204278.36764: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 22736 1727204278.36879: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 22736 1727204278.36883: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 22736 1727204278.36886: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 22736 1727204278.36906: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204278.36945: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 22736 1727204278.37009: variable '__network_wireless_connections_defined' from source: role '' defaults 22736 1727204278.37357: variable 'network_connections' from source: play vars 22736 1727204278.37364: variable 'profile' from source: play vars 22736 1727204278.37450: variable 'profile' from source: play vars 22736 1727204278.37453: variable 'interface' from source: set_fact 22736 1727204278.37556: variable 'interface' from source: set_fact 22736 1727204278.37568: Evaluated conditional (__network_wpa_supplicant_required): False 22736 1727204278.37572: when evaluation is False, skipping this task 22736 1727204278.37578: _execute() done 22736 1727204278.37590: dumping result to json 22736 1727204278.37593: done dumping result, returning 22736 1727204278.37596: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12b410aa-8751-4f4a-548a-000000000069] 22736 1727204278.37602: sending task result for task 12b410aa-8751-4f4a-548a-000000000069 skipping: [managed-node2] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 22736 1727204278.37838: no more pending results, returning what we have 22736 1727204278.37842: results queue empty 22736 1727204278.37843: checking for any_errors_fatal 22736 1727204278.37871: done checking for any_errors_fatal 22736 1727204278.37872: checking for max_fail_percentage 22736 1727204278.37876: done checking for max_fail_percentage 22736 1727204278.37877: checking to see if all hosts have failed and the running result is not ok 22736 1727204278.37878: done checking to see if all hosts have failed 22736 1727204278.37879: getting the remaining hosts for this loop 22736 1727204278.37881: done getting the remaining hosts for this loop 22736 1727204278.37885: getting the next task for host managed-node2 22736 1727204278.38008: done getting next task for host managed-node2 22736 1727204278.38036: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204278.38039: ^ state is: HOST STATE: block=2, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204278.38055: getting variables 22736 1727204278.38057: in VariableManager get_vars() 22736 1727204278.38122: Calling all_inventory to load vars for managed-node2 22736 1727204278.38126: Calling groups_inventory to load vars for managed-node2 22736 1727204278.38129: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204278.38135: done sending task result for task 12b410aa-8751-4f4a-548a-000000000069 22736 1727204278.38138: WORKER PROCESS EXITING 22736 1727204278.38148: Calling all_plugins_play to load vars for managed-node2 22736 1727204278.38152: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204278.38155: Calling groups_plugins_play to load vars for managed-node2 22736 1727204278.39753: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204278.41876: done with get_vars() 22736 1727204278.41918: done getting variables 22736 1727204278.41992: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.111) 0:00:43.205 ***** 22736 1727204278.42021: entering _queue_task() for managed-node2/service 22736 1727204278.42298: worker is 1 (out of 1 available) 22736 1727204278.42312: exiting _queue_task() for managed-node2/service 22736 1727204278.42327: done queuing things up, now waiting for results queue to drain 22736 1727204278.42329: waiting for pending results... 
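The wpa_supplicant task above is skipped without ever contacting the host: the role-default variable __network_wpa_supplicant_required evaluates to False because no wireless or IEEE 802.1X connections are defined in network_connections, so the result carries "false_condition": "__network_wpa_supplicant_required". A rough sketch of how such a guarded service task is commonly written (the task body is an assumption for illustration; only the condition name and the 'service' action are taken from the log):

    - name: Enable and start wpa_supplicant
      ansible.builtin.service:
        name: wpa_supplicant
        state: started
        enabled: true
      when: __network_wpa_supplicant_required   # False here, hence skip_reason "Conditional result was False"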
22736 1727204278.42531: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service 22736 1727204278.42620: in run() - task 12b410aa-8751-4f4a-548a-00000000006a 22736 1727204278.42634: variable 'ansible_search_path' from source: unknown 22736 1727204278.42638: variable 'ansible_search_path' from source: unknown 22736 1727204278.42675: calling self._execute() 22736 1727204278.42759: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.42766: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.42778: variable 'omit' from source: magic vars 22736 1727204278.43103: variable 'ansible_distribution_major_version' from source: facts 22736 1727204278.43120: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204278.43221: variable 'network_provider' from source: set_fact 22736 1727204278.43234: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204278.43237: when evaluation is False, skipping this task 22736 1727204278.43240: _execute() done 22736 1727204278.43243: dumping result to json 22736 1727204278.43245: done dumping result, returning 22736 1727204278.43254: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Enable network service [12b410aa-8751-4f4a-548a-00000000006a] 22736 1727204278.43259: sending task result for task 12b410aa-8751-4f4a-548a-00000000006a 22736 1727204278.43354: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006a 22736 1727204278.43358: WORKER PROCESS EXITING skipping: [managed-node2] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 22736 1727204278.43408: no more pending results, returning what we have 22736 1727204278.43411: results queue empty 22736 1727204278.43413: checking for any_errors_fatal 22736 1727204278.43424: done checking for any_errors_fatal 22736 1727204278.43425: checking for max_fail_percentage 22736 1727204278.43428: done checking for max_fail_percentage 22736 1727204278.43429: checking to see if all hosts have failed and the running result is not ok 22736 1727204278.43430: done checking to see if all hosts have failed 22736 1727204278.43430: getting the remaining hosts for this loop 22736 1727204278.43432: done getting the remaining hosts for this loop 22736 1727204278.43436: getting the next task for host managed-node2 22736 1727204278.43444: done getting next task for host managed-node2 22736 1727204278.43448: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204278.43451: ^ state is: HOST STATE: block=2, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204278.43470: getting variables 22736 1727204278.43472: in VariableManager get_vars() 22736 1727204278.43518: Calling all_inventory to load vars for managed-node2 22736 1727204278.43521: Calling groups_inventory to load vars for managed-node2 22736 1727204278.43524: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204278.43536: Calling all_plugins_play to load vars for managed-node2 22736 1727204278.43539: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204278.43543: Calling groups_plugins_play to load vars for managed-node2 22736 1727204278.44760: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204278.46331: done with get_vars() 22736 1727204278.46355: done getting variables 22736 1727204278.46406: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.044) 0:00:43.249 ***** 22736 1727204278.46432: entering _queue_task() for managed-node2/copy 22736 1727204278.46683: worker is 1 (out of 1 available) 22736 1727204278.46697: exiting _queue_task() for managed-node2/copy 22736 1727204278.46714: done queuing things up, now waiting for results queue to drain 22736 1727204278.46715: waiting for pending results... 22736 1727204278.46917: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 22736 1727204278.46999: in run() - task 12b410aa-8751-4f4a-548a-00000000006b 22736 1727204278.47012: variable 'ansible_search_path' from source: unknown 22736 1727204278.47017: variable 'ansible_search_path' from source: unknown 22736 1727204278.47055: calling self._execute() 22736 1727204278.47137: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.47143: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.47159: variable 'omit' from source: magic vars 22736 1727204278.47477: variable 'ansible_distribution_major_version' from source: facts 22736 1727204278.47487: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204278.47593: variable 'network_provider' from source: set_fact 22736 1727204278.47600: Evaluated conditional (network_provider == "initscripts"): False 22736 1727204278.47603: when evaluation is False, skipping this task 22736 1727204278.47606: _execute() done 22736 1727204278.47611: dumping result to json 22736 1727204278.47615: done dumping result, returning 22736 1727204278.47627: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12b410aa-8751-4f4a-548a-00000000006b] 22736 1727204278.47630: sending task result for task 12b410aa-8751-4f4a-548a-00000000006b skipping: [managed-node2] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 22736 1727204278.47784: no more pending results, returning what we have 22736 
1727204278.47790: results queue empty 22736 1727204278.47792: checking for any_errors_fatal 22736 1727204278.47798: done checking for any_errors_fatal 22736 1727204278.47799: checking for max_fail_percentage 22736 1727204278.47801: done checking for max_fail_percentage 22736 1727204278.47802: checking to see if all hosts have failed and the running result is not ok 22736 1727204278.47803: done checking to see if all hosts have failed 22736 1727204278.47804: getting the remaining hosts for this loop 22736 1727204278.47805: done getting the remaining hosts for this loop 22736 1727204278.47811: getting the next task for host managed-node2 22736 1727204278.47817: done getting next task for host managed-node2 22736 1727204278.47821: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204278.47823: ^ state is: HOST STATE: block=2, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204278.47838: getting variables 22736 1727204278.47840: in VariableManager get_vars() 22736 1727204278.47874: Calling all_inventory to load vars for managed-node2 22736 1727204278.47877: Calling groups_inventory to load vars for managed-node2 22736 1727204278.47879: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204278.47897: Calling all_plugins_play to load vars for managed-node2 22736 1727204278.47901: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204278.47907: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006b 22736 1727204278.47910: WORKER PROCESS EXITING 22736 1727204278.47914: Calling groups_plugins_play to load vars for managed-node2 22736 1727204278.49225: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204278.50777: done with get_vars() 22736 1727204278.50801: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Tuesday 24 September 2024 14:57:58 -0400 (0:00:00.044) 0:00:43.293 ***** 22736 1727204278.50872: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204278.51130: worker is 1 (out of 1 available) 22736 1727204278.51146: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_connections 22736 1727204278.51162: done queuing things up, now waiting for results queue to drain 22736 1727204278.51164: waiting for pending results... 
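The next two tasks, "Enable network service" and "Ensure initscripts network file dependency is present", are skipped for the same reason: both are guarded by network_provider == "initscripts", and the provider resolved to "nm" earlier in the run. The second one loads the 'copy' action plugin before the condition short-circuits it. A hypothetical sketch of that guard, with an assumed destination path (neither the file content nor the target path is shown in this log):

    - name: Ensure initscripts network file dependency is present
      ansible.builtin.copy:
        content: ""                      # placeholder content, assumed for illustration
        dest: /etc/sysconfig/network     # assumed target file
        force: false                     # do not overwrite an existing file
      when: network_provider == "initscripts"   # False when the provider is "nm"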
22736 1727204278.51358: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 22736 1727204278.51443: in run() - task 12b410aa-8751-4f4a-548a-00000000006c 22736 1727204278.51457: variable 'ansible_search_path' from source: unknown 22736 1727204278.51460: variable 'ansible_search_path' from source: unknown 22736 1727204278.51496: calling self._execute() 22736 1727204278.51575: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.51581: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.51593: variable 'omit' from source: magic vars 22736 1727204278.51918: variable 'ansible_distribution_major_version' from source: facts 22736 1727204278.51932: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204278.51945: variable 'omit' from source: magic vars 22736 1727204278.51978: variable 'omit' from source: magic vars 22736 1727204278.52122: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 22736 1727204278.53857: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 22736 1727204278.53919: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 22736 1727204278.53954: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 22736 1727204278.53985: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 22736 1727204278.54014: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 22736 1727204278.54083: variable 'network_provider' from source: set_fact 22736 1727204278.54204: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 22736 1727204278.54247: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 22736 1727204278.54268: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 22736 1727204278.54302: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 22736 1727204278.54315: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 22736 1727204278.54386: variable 'omit' from source: magic vars 22736 1727204278.54486: variable 'omit' from source: magic vars 22736 1727204278.54578: variable 'network_connections' from source: play vars 22736 1727204278.54590: variable 'profile' from source: play vars 22736 1727204278.54652: variable 'profile' from source: play vars 22736 1727204278.54658: variable 'interface' from source: set_fact 22736 1727204278.54711: variable 'interface' from source: set_fact 22736 1727204278.54837: variable 'omit' from source: magic vars 22736 1727204278.54846: 
variable '__lsr_ansible_managed' from source: task vars 22736 1727204278.54903: variable '__lsr_ansible_managed' from source: task vars 22736 1727204278.55145: Loaded config def from plugin (lookup/template) 22736 1727204278.55150: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 22736 1727204278.55177: File lookup term: get_ansible_managed.j2 22736 1727204278.55180: variable 'ansible_search_path' from source: unknown 22736 1727204278.55186: evaluation_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 22736 1727204278.55204: search_path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 22736 1727204278.55220: variable 'ansible_search_path' from source: unknown 22736 1727204278.60719: variable 'ansible_managed' from source: unknown 22736 1727204278.60864: variable 'omit' from source: magic vars 22736 1727204278.60891: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204278.60915: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204278.60932: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204278.60950: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204278.60961: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204278.60985: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204278.60991: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.60995: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.61076: Set connection var ansible_timeout to 10 22736 1727204278.61086: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204278.61097: Set connection var ansible_shell_executable to /bin/sh 22736 1727204278.61100: Set connection var ansible_shell_type to sh 22736 1727204278.61112: Set connection var ansible_pipelining to False 22736 1727204278.61114: Set connection var ansible_connection to ssh 22736 1727204278.61150: variable 'ansible_shell_executable' from source: unknown 22736 1727204278.61153: variable 'ansible_connection' from source: unknown 22736 1727204278.61157: 
variable 'ansible_module_compression' from source: unknown 22736 1727204278.61160: variable 'ansible_shell_type' from source: unknown 22736 1727204278.61163: variable 'ansible_shell_executable' from source: unknown 22736 1727204278.61165: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204278.61230: variable 'ansible_pipelining' from source: unknown 22736 1727204278.61233: variable 'ansible_timeout' from source: unknown 22736 1727204278.61236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204278.61350: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204278.61362: variable 'omit' from source: magic vars 22736 1727204278.61365: starting attempt loop 22736 1727204278.61368: running the handler 22736 1727204278.61370: _low_level_execute_command(): starting 22736 1727204278.61372: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204278.62096: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.62100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.62103: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.62105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204278.62126: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204278.62130: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204278.62132: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.62148: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204278.62151: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204278.62161: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 22736 1727204278.62170: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.62181: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.62208: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204278.62214: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204278.62223: stderr chunk (state=3): >>>debug2: match found <<< 22736 1727204278.62234: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.62308: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204278.62323: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204278.62405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204278.62453: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.64238: stdout chunk (state=3): >>>/root <<< 22736 1727204278.64348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204278.64422: stderr chunk (state=3): 
>>><<< 22736 1727204278.64426: stdout chunk (state=3): >>><<< 22736 1727204278.64550: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204278.64554: _low_level_execute_command(): starting 22736 1727204278.64557: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157 `" && echo ansible-tmp-1727204278.6444814-25009-28185830603157="` echo /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157 `" ) && sleep 0' 22736 1727204278.65098: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.65112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.65134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.65153: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204278.65169: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204278.65250: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.65300: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204278.65316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204278.65358: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204278.65482: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.67530: stdout chunk (state=3): >>>ansible-tmp-1727204278.6444814-25009-28185830603157=/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157 <<< 22736 
1727204278.67800: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204278.67803: stdout chunk (state=3): >>><<< 22736 1727204278.67806: stderr chunk (state=3): >>><<< 22736 1727204278.67808: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204278.6444814-25009-28185830603157=/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204278.67827: variable 'ansible_module_compression' from source: unknown 22736 1727204278.67881: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 22736 1727204278.67956: variable 'ansible_facts' from source: unknown 22736 1727204278.68104: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py 22736 1727204278.68282: Sending initial data 22736 1727204278.68286: Sent initial data (167 bytes) 22736 1727204278.69000: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.69015: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.69102: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204278.69163: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204278.69208: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.70924: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204278.70954: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204278.71194: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpd4u5ufy5 /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py <<< 22736 1727204278.71197: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py" <<< 22736 1727204278.71225: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpd4u5ufy5" to remote "/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py" <<< 22736 1727204278.73034: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204278.73142: stderr chunk (state=3): >>><<< 22736 1727204278.73152: stdout chunk (state=3): >>><<< 22736 1727204278.73183: done transferring module to remote 22736 1727204278.73209: _low_level_execute_command(): starting 22736 1727204278.73222: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/ /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py && sleep 0' 22736 1727204278.73872: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.73885: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.73903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.73925: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204278.73955: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.74065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204278.74084: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204278.74153: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204278.76180: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204278.76194: stdout chunk (state=3): >>><<< 22736 1727204278.76208: stderr chunk (state=3): >>><<< 22736 1727204278.76235: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204278.76245: _low_level_execute_command(): starting 22736 1727204278.76256: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/AnsiballZ_network_connections.py && sleep 0' 22736 1727204278.76878: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204278.76898: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204278.76915: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204278.76938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204278.76961: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204278.77009: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204278.77084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204278.77108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204278.77134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 22736 1727204278.77217: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.07791: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_irconxx1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 22736 1727204279.07821: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_irconxx1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/752bdb29-49cb-43fb-bb8f-6bafcdca1322: error=unknown <<< 22736 1727204279.08013: stdout chunk (state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 22736 1727204279.10115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204279.10129: stdout chunk (state=3): >>><<< 22736 1727204279.10151: stderr chunk (state=3): >>><<< 22736 1727204279.10295: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_irconxx1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_irconxx1/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on lsr27/752bdb29-49cb-43fb-bb8f-6bafcdca1322: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "lsr27", "persistent_state": "absent"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
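The module_args echoed in the result above — provider "nm" and a single connection "lsr27" with persistent_state "absent" — are what the fedora.linux_system_roles.network role hands to its network_connections module. A minimal play that would request the same profile removal presumably looks like the sketch below (the host pattern and play layout are assumptions; the variable names follow the role's documented interface):

    # Hypothetical play mirroring the module_args seen in the log above:
    # ask the network role to remove the lsr27 profile via NetworkManager.
    - hosts: managed-node2
      roles:
        - role: fedora.linux_system_roles.network
          vars:
            network_provider: nm          # -> "provider": "nm"
            network_connections:          # -> "connections": [...]
              - name: lsr27
                persistent_state: absent

Note that the LsrNetworkNmError traceback is written to stdout ahead of the JSON result, but the module still exits with rc=0 and reports "changed": true with an effectively empty stderr, so the task is not treated as failed and the run continues.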
22736 1727204279.10298: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'lsr27', 'persistent_state': 'absent'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204279.10301: _low_level_execute_command(): starting 22736 1727204279.10304: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204278.6444814-25009-28185830603157/ > /dev/null 2>&1 && sleep 0' 22736 1727204279.10908: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204279.10924: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204279.10940: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204279.10961: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204279.11080: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.11105: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.11172: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.13170: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.13253: stderr chunk (state=3): >>><<< 22736 1727204279.13263: stdout chunk (state=3): >>><<< 22736 1727204279.13284: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204279.13495: handler run complete 22736 1727204279.13498: attempt loop complete, returning result 22736 1727204279.13501: _execute() done 22736 1727204279.13504: dumping result to json 22736 1727204279.13506: done dumping result, returning 22736 1727204279.13509: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12b410aa-8751-4f4a-548a-00000000006c] 22736 1727204279.13511: sending task result for task 12b410aa-8751-4f4a-548a-00000000006c 22736 1727204279.13595: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006c 22736 1727204279.13599: WORKER PROCESS EXITING changed: [managed-node2] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 22736 1727204279.13737: no more pending results, returning what we have 22736 1727204279.13742: results queue empty 22736 1727204279.13743: checking for any_errors_fatal 22736 1727204279.13752: done checking for any_errors_fatal 22736 1727204279.13753: checking for max_fail_percentage 22736 1727204279.13755: done checking for max_fail_percentage 22736 1727204279.13756: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.13757: done checking to see if all hosts have failed 22736 1727204279.13758: getting the remaining hosts for this loop 22736 1727204279.13760: done getting the remaining hosts for this loop 22736 1727204279.13765: getting the next task for host managed-node2 22736 1727204279.13772: done getting next task for host managed-node2 22736 1727204279.13777: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204279.13779: ^ state is: HOST STATE: block=2, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.13827: getting variables 22736 1727204279.13830: in VariableManager get_vars() 22736 1727204279.13875: Calling all_inventory to load vars for managed-node2 22736 1727204279.13878: Calling groups_inventory to load vars for managed-node2 22736 1727204279.13881: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.13898: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.13902: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.14103: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.16072: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.18149: done with get_vars() 22736 1727204279.18182: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.674) 0:00:43.967 ***** 22736 1727204279.18293: entering _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204279.18604: worker is 1 (out of 1 available) 22736 1727204279.18623: exiting _queue_task() for managed-node2/fedora.linux_system_roles.network_state 22736 1727204279.18637: done queuing things up, now waiting for results queue to drain 22736 1727204279.18639: waiting for pending results... 22736 1727204279.18832: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state 22736 1727204279.18913: in run() - task 12b410aa-8751-4f4a-548a-00000000006d 22736 1727204279.18931: variable 'ansible_search_path' from source: unknown 22736 1727204279.18934: variable 'ansible_search_path' from source: unknown 22736 1727204279.18967: calling self._execute() 22736 1727204279.19047: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.19053: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.19065: variable 'omit' from source: magic vars 22736 1727204279.19396: variable 'ansible_distribution_major_version' from source: facts 22736 1727204279.19408: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204279.19521: variable 'network_state' from source: role '' defaults 22736 1727204279.19529: Evaluated conditional (network_state != {}): False 22736 1727204279.19533: when evaluation is False, skipping this task 22736 1727204279.19538: _execute() done 22736 1727204279.19541: dumping result to json 22736 1727204279.19546: done dumping result, returning 22736 1727204279.19554: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Configure networking state [12b410aa-8751-4f4a-548a-00000000006d] 22736 1727204279.19559: sending task result for task 12b410aa-8751-4f4a-548a-00000000006d 22736 1727204279.19657: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006d 22736 1727204279.19660: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 22736 1727204279.19724: no more pending results, returning what we have 22736 1727204279.19728: results queue empty 22736 1727204279.19729: checking for any_errors_fatal 22736 1727204279.19736: done checking for any_errors_fatal 22736 1727204279.19736: checking for max_fail_percentage 22736 
1727204279.19738: done checking for max_fail_percentage 22736 1727204279.19739: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.19740: done checking to see if all hosts have failed 22736 1727204279.19741: getting the remaining hosts for this loop 22736 1727204279.19743: done getting the remaining hosts for this loop 22736 1727204279.19747: getting the next task for host managed-node2 22736 1727204279.19753: done getting next task for host managed-node2 22736 1727204279.19757: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204279.19759: ^ state is: HOST STATE: block=2, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204279.19774: getting variables 22736 1727204279.19776: in VariableManager get_vars() 22736 1727204279.19811: Calling all_inventory to load vars for managed-node2 22736 1727204279.19814: Calling groups_inventory to load vars for managed-node2 22736 1727204279.19816: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.19828: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.19832: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.19835: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.23186: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.26079: done with get_vars() 22736 1727204279.26134: done getting variables 22736 1727204279.26206: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.079) 0:00:44.047 ***** 22736 1727204279.26249: entering _queue_task() for managed-node2/debug 22736 1727204279.26616: worker is 1 (out of 1 available) 22736 1727204279.26633: exiting _queue_task() for managed-node2/debug 22736 1727204279.26645: done queuing things up, now waiting for results queue to drain 22736 1727204279.26647: waiting for pending results... 
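The task queued here is a plain debug action. Judging from the value it prints a little further down — the single key "__network_connections_result.stderr_lines" — the task in the role's tasks/main.yml is presumably equivalent to this sketch (not the verbatim role source):

    # Sketch of the "Show stderr messages for the network_connections" task,
    # inferred from the debug output it produces in this run.
    - name: Show stderr messages for the network_connections
      debug:
        var: __network_connections_result.stderr_lines

Because the earlier module returned stderr as a single newline, stderr_lines collapses to one empty string, which is exactly what the ok: result below shows.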
22736 1727204279.27024: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 22736 1727204279.27069: in run() - task 12b410aa-8751-4f4a-548a-00000000006e 22736 1727204279.27086: variable 'ansible_search_path' from source: unknown 22736 1727204279.27093: variable 'ansible_search_path' from source: unknown 22736 1727204279.27227: calling self._execute() 22736 1727204279.27246: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.27255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.27347: variable 'omit' from source: magic vars 22736 1727204279.27733: variable 'ansible_distribution_major_version' from source: facts 22736 1727204279.27744: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204279.27752: variable 'omit' from source: magic vars 22736 1727204279.27794: variable 'omit' from source: magic vars 22736 1727204279.27828: variable 'omit' from source: magic vars 22736 1727204279.27863: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204279.27899: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204279.27918: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204279.27937: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.27948: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.27978: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204279.27982: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.27986: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.28074: Set connection var ansible_timeout to 10 22736 1727204279.28085: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204279.28095: Set connection var ansible_shell_executable to /bin/sh 22736 1727204279.28097: Set connection var ansible_shell_type to sh 22736 1727204279.28110: Set connection var ansible_pipelining to False 22736 1727204279.28113: Set connection var ansible_connection to ssh 22736 1727204279.28133: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.28136: variable 'ansible_connection' from source: unknown 22736 1727204279.28139: variable 'ansible_module_compression' from source: unknown 22736 1727204279.28143: variable 'ansible_shell_type' from source: unknown 22736 1727204279.28145: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.28150: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.28156: variable 'ansible_pipelining' from source: unknown 22736 1727204279.28160: variable 'ansible_timeout' from source: unknown 22736 1727204279.28165: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.28288: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 
1727204279.28301: variable 'omit' from source: magic vars 22736 1727204279.28305: starting attempt loop 22736 1727204279.28311: running the handler 22736 1727204279.28439: variable '__network_connections_result' from source: set_fact 22736 1727204279.28476: handler run complete 22736 1727204279.28494: attempt loop complete, returning result 22736 1727204279.28497: _execute() done 22736 1727204279.28502: dumping result to json 22736 1727204279.28507: done dumping result, returning 22736 1727204279.28516: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12b410aa-8751-4f4a-548a-00000000006e] 22736 1727204279.28523: sending task result for task 12b410aa-8751-4f4a-548a-00000000006e 22736 1727204279.28613: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006e 22736 1727204279.28616: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result.stderr_lines": [ "" ] } 22736 1727204279.28717: no more pending results, returning what we have 22736 1727204279.28721: results queue empty 22736 1727204279.28722: checking for any_errors_fatal 22736 1727204279.28728: done checking for any_errors_fatal 22736 1727204279.28728: checking for max_fail_percentage 22736 1727204279.28731: done checking for max_fail_percentage 22736 1727204279.28732: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.28733: done checking to see if all hosts have failed 22736 1727204279.28733: getting the remaining hosts for this loop 22736 1727204279.28735: done getting the remaining hosts for this loop 22736 1727204279.28739: getting the next task for host managed-node2 22736 1727204279.28745: done getting next task for host managed-node2 22736 1727204279.28750: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204279.28752: ^ state is: HOST STATE: block=2, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.28764: getting variables 22736 1727204279.28766: in VariableManager get_vars() 22736 1727204279.28804: Calling all_inventory to load vars for managed-node2 22736 1727204279.28807: Calling groups_inventory to load vars for managed-node2 22736 1727204279.28810: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.28820: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.28823: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.28826: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.30622: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.32174: done with get_vars() 22736 1727204279.32200: done getting variables 22736 1727204279.32253: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.060) 0:00:44.107 ***** 22736 1727204279.32279: entering _queue_task() for managed-node2/debug 22736 1727204279.32548: worker is 1 (out of 1 available) 22736 1727204279.32564: exiting _queue_task() for managed-node2/debug 22736 1727204279.32578: done queuing things up, now waiting for results queue to drain 22736 1727204279.32579: waiting for pending results... 22736 1727204279.32776: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 22736 1727204279.32860: in run() - task 12b410aa-8751-4f4a-548a-00000000006f 22736 1727204279.32874: variable 'ansible_search_path' from source: unknown 22736 1727204279.32878: variable 'ansible_search_path' from source: unknown 22736 1727204279.32918: calling self._execute() 22736 1727204279.32995: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.33002: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.33013: variable 'omit' from source: magic vars 22736 1727204279.33343: variable 'ansible_distribution_major_version' from source: facts 22736 1727204279.33356: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204279.33364: variable 'omit' from source: magic vars 22736 1727204279.33403: variable 'omit' from source: magic vars 22736 1727204279.33438: variable 'omit' from source: magic vars 22736 1727204279.33475: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204279.33510: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204279.33530: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204279.33547: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.33560: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.33593: variable 
'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204279.33597: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.33600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.33687: Set connection var ansible_timeout to 10 22736 1727204279.33701: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204279.33710: Set connection var ansible_shell_executable to /bin/sh 22736 1727204279.33713: Set connection var ansible_shell_type to sh 22736 1727204279.33719: Set connection var ansible_pipelining to False 22736 1727204279.33726: Set connection var ansible_connection to ssh 22736 1727204279.33745: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.33748: variable 'ansible_connection' from source: unknown 22736 1727204279.33751: variable 'ansible_module_compression' from source: unknown 22736 1727204279.33755: variable 'ansible_shell_type' from source: unknown 22736 1727204279.33759: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.33763: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.33768: variable 'ansible_pipelining' from source: unknown 22736 1727204279.33772: variable 'ansible_timeout' from source: unknown 22736 1727204279.33777: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.33901: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204279.33995: variable 'omit' from source: magic vars 22736 1727204279.33999: starting attempt loop 22736 1727204279.34003: running the handler 22736 1727204279.34006: variable '__network_connections_result' from source: set_fact 22736 1727204279.34040: variable '__network_connections_result' from source: set_fact 22736 1727204279.34135: handler run complete 22736 1727204279.34158: attempt loop complete, returning result 22736 1727204279.34161: _execute() done 22736 1727204279.34164: dumping result to json 22736 1727204279.34170: done dumping result, returning 22736 1727204279.34178: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12b410aa-8751-4f4a-548a-00000000006f] 22736 1727204279.34181: sending task result for task 12b410aa-8751-4f4a-548a-00000000006f 22736 1727204279.34278: done sending task result for task 12b410aa-8751-4f4a-548a-00000000006f 22736 1727204279.34281: WORKER PROCESS EXITING ok: [managed-node2] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "lsr27", "persistent_state": "absent" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 22736 1727204279.34377: no more pending results, returning what we have 22736 1727204279.34381: results queue empty 22736 1727204279.34382: checking for any_errors_fatal 22736 1727204279.34391: done checking for any_errors_fatal 22736 1727204279.34392: checking for max_fail_percentage 22736 1727204279.34393: done checking for max_fail_percentage 22736 1727204279.34395: checking to see if 
all hosts have failed and the running result is not ok 22736 1727204279.34396: done checking to see if all hosts have failed 22736 1727204279.34396: getting the remaining hosts for this loop 22736 1727204279.34398: done getting the remaining hosts for this loop 22736 1727204279.34403: getting the next task for host managed-node2 22736 1727204279.34410: done getting next task for host managed-node2 22736 1727204279.34414: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204279.34416: ^ state is: HOST STATE: block=2, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204279.34427: getting variables 22736 1727204279.34429: in VariableManager get_vars() 22736 1727204279.34463: Calling all_inventory to load vars for managed-node2 22736 1727204279.34466: Calling groups_inventory to load vars for managed-node2 22736 1727204279.34469: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.34479: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.34482: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.34485: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.35697: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.40824: done with get_vars() 22736 1727204279.40847: done getting variables 22736 1727204279.40891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.086) 0:00:44.194 ***** 22736 1727204279.40918: entering _queue_task() for managed-node2/debug 22736 1727204279.41199: worker is 1 (out of 1 available) 22736 1727204279.41215: exiting _queue_task() for managed-node2/debug 22736 1727204279.41229: done queuing things up, now waiting for results queue to drain 22736 1727204279.41232: waiting for pending results... 
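The next task, like "Configure networking state" earlier, is guarded by the conditional network_state != {}; network_state comes from the role defaults as an empty dict, so both tasks are skipped in this run. A hypothetical, nmstate-style value that would make the conditional true looks roughly like this (illustrative only; the interface name and settings are assumptions, not taken from this run):

    # Hypothetical non-empty network_state; with a value like this the
    # "network_state != {}" conditionals in this run would evaluate to True.
    network_state:
      interfaces:
        - name: eth1
          type: ethernet
          state: up
          ipv4:
            enabled: true
            dhcp: true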
22736 1727204279.41442: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 22736 1727204279.41526: in run() - task 12b410aa-8751-4f4a-548a-000000000070 22736 1727204279.41540: variable 'ansible_search_path' from source: unknown 22736 1727204279.41544: variable 'ansible_search_path' from source: unknown 22736 1727204279.41579: calling self._execute() 22736 1727204279.41664: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.41671: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.41681: variable 'omit' from source: magic vars 22736 1727204279.42016: variable 'ansible_distribution_major_version' from source: facts 22736 1727204279.42028: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204279.42131: variable 'network_state' from source: role '' defaults 22736 1727204279.42141: Evaluated conditional (network_state != {}): False 22736 1727204279.42145: when evaluation is False, skipping this task 22736 1727204279.42149: _execute() done 22736 1727204279.42152: dumping result to json 22736 1727204279.42157: done dumping result, returning 22736 1727204279.42165: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12b410aa-8751-4f4a-548a-000000000070] 22736 1727204279.42172: sending task result for task 12b410aa-8751-4f4a-548a-000000000070 22736 1727204279.42274: done sending task result for task 12b410aa-8751-4f4a-548a-000000000070 22736 1727204279.42277: WORKER PROCESS EXITING skipping: [managed-node2] => { "false_condition": "network_state != {}" } 22736 1727204279.42332: no more pending results, returning what we have 22736 1727204279.42337: results queue empty 22736 1727204279.42338: checking for any_errors_fatal 22736 1727204279.42346: done checking for any_errors_fatal 22736 1727204279.42346: checking for max_fail_percentage 22736 1727204279.42348: done checking for max_fail_percentage 22736 1727204279.42350: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.42351: done checking to see if all hosts have failed 22736 1727204279.42352: getting the remaining hosts for this loop 22736 1727204279.42354: done getting the remaining hosts for this loop 22736 1727204279.42358: getting the next task for host managed-node2 22736 1727204279.42364: done getting next task for host managed-node2 22736 1727204279.42368: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204279.42370: ^ state is: HOST STATE: block=2, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.42385: getting variables 22736 1727204279.42387: in VariableManager get_vars() 22736 1727204279.42425: Calling all_inventory to load vars for managed-node2 22736 1727204279.42429: Calling groups_inventory to load vars for managed-node2 22736 1727204279.42431: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.42441: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.42444: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.42447: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.43633: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.45222: done with get_vars() 22736 1727204279.45245: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Tuesday 24 September 2024 14:57:59 -0400 (0:00:00.044) 0:00:44.238 ***** 22736 1727204279.45324: entering _queue_task() for managed-node2/ping 22736 1727204279.45569: worker is 1 (out of 1 available) 22736 1727204279.45585: exiting _queue_task() for managed-node2/ping 22736 1727204279.45599: done queuing things up, now waiting for results queue to drain 22736 1727204279.45601: waiting for pending results... 22736 1727204279.45788: running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity 22736 1727204279.45870: in run() - task 12b410aa-8751-4f4a-548a-000000000071 22736 1727204279.45883: variable 'ansible_search_path' from source: unknown 22736 1727204279.45886: variable 'ansible_search_path' from source: unknown 22736 1727204279.45943: calling self._execute() 22736 1727204279.46005: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.46012: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.46023: variable 'omit' from source: magic vars 22736 1727204279.46347: variable 'ansible_distribution_major_version' from source: facts 22736 1727204279.46359: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204279.46365: variable 'omit' from source: magic vars 22736 1727204279.46405: variable 'omit' from source: magic vars 22736 1727204279.46437: variable 'omit' from source: magic vars 22736 1727204279.46470: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204279.46508: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204279.46527: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204279.46543: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.46555: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204279.46583: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204279.46589: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.46592: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.46679: Set connection var ansible_timeout to 10 22736 1727204279.46691: Set connection var 
ansible_module_compression to ZIP_DEFLATED 22736 1727204279.46701: Set connection var ansible_shell_executable to /bin/sh 22736 1727204279.46705: Set connection var ansible_shell_type to sh 22736 1727204279.46715: Set connection var ansible_pipelining to False 22736 1727204279.46720: Set connection var ansible_connection to ssh 22736 1727204279.46738: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.46741: variable 'ansible_connection' from source: unknown 22736 1727204279.46744: variable 'ansible_module_compression' from source: unknown 22736 1727204279.46746: variable 'ansible_shell_type' from source: unknown 22736 1727204279.46752: variable 'ansible_shell_executable' from source: unknown 22736 1727204279.46756: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204279.46761: variable 'ansible_pipelining' from source: unknown 22736 1727204279.46764: variable 'ansible_timeout' from source: unknown 22736 1727204279.46769: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204279.46945: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204279.46956: variable 'omit' from source: magic vars 22736 1727204279.46959: starting attempt loop 22736 1727204279.46964: running the handler 22736 1727204279.46979: _low_level_execute_command(): starting 22736 1727204279.46986: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204279.47542: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204279.47546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.47550: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204279.47553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.47615: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.47622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.47663: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.49470: stdout chunk (state=3): >>>/root <<< 22736 1727204279.49676: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.49680: stdout chunk (state=3): >>><<< 22736 1727204279.49682: stderr chunk (state=3): >>><<< 22736 1727204279.49712: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204279.49738: _low_level_execute_command(): starting 22736 1727204279.49836: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965 `" && echo ansible-tmp-1727204279.4972262-25043-205198434155965="` echo /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965 `" ) && sleep 0' 22736 1727204279.50469: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.50507: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.50585: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.52706: stdout chunk (state=3): >>>ansible-tmp-1727204279.4972262-25043-205198434155965=/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965 <<< 22736 1727204279.53109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.53112: stdout chunk (state=3): >>><<< 22736 1727204279.53114: stderr chunk (state=3): >>><<< 22736 1727204279.53116: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204279.4972262-25043-205198434155965=/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204279.53118: variable 'ansible_module_compression' from source: unknown 22736 1727204279.53148: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 22736 1727204279.53299: variable 'ansible_facts' from source: unknown 22736 1727204279.53393: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py 22736 1727204279.53650: Sending initial data 22736 1727204279.53654: Sent initial data (153 bytes) 22736 1727204279.54276: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.54298: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.54376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.56141: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" 
revision 1 <<< 22736 1727204279.56209: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204279.56242: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp65v2fk_2 /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py <<< 22736 1727204279.56269: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py" <<< 22736 1727204279.56310: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp65v2fk_2" to remote "/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py" <<< 22736 1727204279.57370: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.57412: stderr chunk (state=3): >>><<< 22736 1727204279.57423: stdout chunk (state=3): >>><<< 22736 1727204279.57459: done transferring module to remote 22736 1727204279.57478: _low_level_execute_command(): starting 22736 1727204279.57574: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/ /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py && sleep 0' 22736 1727204279.58237: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204279.58241: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204279.58243: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204279.58246: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204279.58248: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204279.58251: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204279.58312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.58360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204279.58381: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.58400: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.58476: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.60599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.60603: stdout chunk (state=3): >>><<< 22736 1727204279.60605: stderr chunk (state=3): >>><<< 22736 1727204279.60608: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204279.60615: _low_level_execute_command(): starting 22736 1727204279.60617: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/AnsiballZ_ping.py && sleep 0' 22736 1727204279.61207: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204279.61217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204279.61233: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204279.61255: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204279.61269: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204279.61277: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204279.61287: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.61305: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204279.61314: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204279.61369: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.61421: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204279.61437: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204279.61484: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.61537: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.78919: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 22736 1727204279.80429: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
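With the ping payload {"ping": "pong"} back on stdout, the role's final connectivity re-test has effectively succeeded. The corresponding task at tasks/main.yml:192 is presumably no more than the bare ping module, roughly:

    # Sketch of the role's "Re-test connectivity" task; the bare ping module
    # returns {"ping": "pong"} when the target's Python interpreter is usable.
    - name: Re-test connectivity
      ping: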
<<< 22736 1727204279.80434: stderr chunk (state=3): >>><<< 22736 1727204279.80436: stdout chunk (state=3): >>><<< 22736 1727204279.80697: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204279.80702: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204279.80705: _low_level_execute_command(): starting 22736 1727204279.80707: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204279.4972262-25043-205198434155965/ > /dev/null 2>&1 && sleep 0' 22736 1727204279.81112: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204279.81161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204279.81201: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204279.81232: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204279.81265: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204279.83272: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204279.83325: stderr chunk (state=3): >>><<< 22736 1727204279.83328: stdout chunk (state=3): >>><<< 22736 1727204279.83344: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204279.83351: handler run complete 22736 1727204279.83368: attempt loop complete, returning result 22736 1727204279.83372: _execute() done 22736 1727204279.83375: dumping result to json 22736 1727204279.83379: done dumping result, returning 22736 1727204279.83390: done running TaskExecutor() for managed-node2/TASK: fedora.linux_system_roles.network : Re-test connectivity [12b410aa-8751-4f4a-548a-000000000071] 22736 1727204279.83393: sending task result for task 12b410aa-8751-4f4a-548a-000000000071 22736 1727204279.83491: done sending task result for task 12b410aa-8751-4f4a-548a-000000000071 22736 1727204279.83495: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "ping": "pong" } 22736 1727204279.83567: no more pending results, returning what we have 22736 1727204279.83571: results queue empty 22736 1727204279.83572: checking for any_errors_fatal 22736 1727204279.83580: done checking for any_errors_fatal 22736 1727204279.83581: checking for max_fail_percentage 22736 1727204279.83582: done checking for max_fail_percentage 22736 1727204279.83583: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.83585: done checking to see if all hosts have failed 22736 1727204279.83585: getting the remaining hosts for this loop 22736 1727204279.83587: done getting the remaining hosts for this loop 22736 1727204279.83594: getting the next task for host managed-node2 22736 1727204279.83603: done getting next task for host managed-node2 22736 1727204279.83612: ^ task is: TASK: meta (role_complete) 22736 1727204279.83615: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.83629: getting variables 22736 1727204279.83631: in VariableManager get_vars() 22736 1727204279.83672: Calling all_inventory to load vars for managed-node2 22736 1727204279.83675: Calling groups_inventory to load vars for managed-node2 22736 1727204279.83677: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.83691: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.83695: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.83698: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.85611: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.87194: done with get_vars() 22736 1727204279.87220: done getting variables 22736 1727204279.87292: done queuing things up, now waiting for results queue to drain 22736 1727204279.87294: results queue empty 22736 1727204279.87295: checking for any_errors_fatal 22736 1727204279.87297: done checking for any_errors_fatal 22736 1727204279.87298: checking for max_fail_percentage 22736 1727204279.87299: done checking for max_fail_percentage 22736 1727204279.87299: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.87300: done checking to see if all hosts have failed 22736 1727204279.87301: getting the remaining hosts for this loop 22736 1727204279.87301: done getting the remaining hosts for this loop 22736 1727204279.87303: getting the next task for host managed-node2 22736 1727204279.87306: done getting next task for host managed-node2 22736 1727204279.87308: ^ task is: TASK: meta (flush_handlers) 22736 1727204279.87309: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.87311: getting variables 22736 1727204279.87312: in VariableManager get_vars() 22736 1727204279.87325: Calling all_inventory to load vars for managed-node2 22736 1727204279.87327: Calling groups_inventory to load vars for managed-node2 22736 1727204279.87328: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.87333: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.87335: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.87337: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.88428: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.90101: done with get_vars() 22736 1727204279.90124: done getting variables 22736 1727204279.90169: in VariableManager get_vars() 22736 1727204279.90180: Calling all_inventory to load vars for managed-node2 22736 1727204279.90182: Calling groups_inventory to load vars for managed-node2 22736 1727204279.90184: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.90188: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.90191: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.90194: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.91266: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.92837: done with get_vars() 22736 1727204279.92863: done queuing things up, now waiting for results queue to drain 22736 1727204279.92865: results queue empty 22736 1727204279.92866: checking for any_errors_fatal 22736 1727204279.92868: done checking for any_errors_fatal 22736 1727204279.92868: checking for max_fail_percentage 22736 1727204279.92869: done checking for max_fail_percentage 22736 1727204279.92870: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.92870: done checking to see if all hosts have failed 22736 1727204279.92871: getting the remaining hosts for this loop 22736 1727204279.92872: done getting the remaining hosts for this loop 22736 1727204279.92874: getting the next task for host managed-node2 22736 1727204279.92880: done getting next task for host managed-node2 22736 1727204279.92881: ^ task is: TASK: meta (flush_handlers) 22736 1727204279.92883: ^ state is: HOST STATE: block=5, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.92885: getting variables 22736 1727204279.92886: in VariableManager get_vars() 22736 1727204279.92897: Calling all_inventory to load vars for managed-node2 22736 1727204279.92899: Calling groups_inventory to load vars for managed-node2 22736 1727204279.92901: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.92905: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.92907: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.92909: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.94025: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.95567: done with get_vars() 22736 1727204279.95592: done getting variables 22736 1727204279.95635: in VariableManager get_vars() 22736 1727204279.95644: Calling all_inventory to load vars for managed-node2 22736 1727204279.95646: Calling groups_inventory to load vars for managed-node2 22736 1727204279.95648: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.95651: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.95653: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.95655: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.96725: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204279.98372: done with get_vars() 22736 1727204279.98398: done queuing things up, now waiting for results queue to drain 22736 1727204279.98401: results queue empty 22736 1727204279.98402: checking for any_errors_fatal 22736 1727204279.98403: done checking for any_errors_fatal 22736 1727204279.98403: checking for max_fail_percentage 22736 1727204279.98404: done checking for max_fail_percentage 22736 1727204279.98405: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.98410: done checking to see if all hosts have failed 22736 1727204279.98410: getting the remaining hosts for this loop 22736 1727204279.98411: done getting the remaining hosts for this loop 22736 1727204279.98413: getting the next task for host managed-node2 22736 1727204279.98416: done getting next task for host managed-node2 22736 1727204279.98416: ^ task is: None 22736 1727204279.98419: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.98420: done queuing things up, now waiting for results queue to drain 22736 1727204279.98421: results queue empty 22736 1727204279.98422: checking for any_errors_fatal 22736 1727204279.98422: done checking for any_errors_fatal 22736 1727204279.98423: checking for max_fail_percentage 22736 1727204279.98424: done checking for max_fail_percentage 22736 1727204279.98424: checking to see if all hosts have failed and the running result is not ok 22736 1727204279.98425: done checking to see if all hosts have failed 22736 1727204279.98426: getting the next task for host managed-node2 22736 1727204279.98427: done getting next task for host managed-node2 22736 1727204279.98428: ^ task is: None 22736 1727204279.98429: ^ state is: HOST STATE: block=6, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204279.98467: in VariableManager get_vars() 22736 1727204279.98480: done with get_vars() 22736 1727204279.98484: in VariableManager get_vars() 22736 1727204279.98494: done with get_vars() 22736 1727204279.98497: variable 'omit' from source: magic vars 22736 1727204279.98527: in VariableManager get_vars() 22736 1727204279.98535: done with get_vars() 22736 1727204279.98552: variable 'omit' from source: magic vars PLAY [Assert device and profile are absent] ************************************ 22736 1727204279.98703: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204279.98726: getting the remaining hosts for this loop 22736 1727204279.98727: done getting the remaining hosts for this loop 22736 1727204279.98731: getting the next task for host managed-node2 22736 1727204279.98734: done getting next task for host managed-node2 22736 1727204279.98735: ^ task is: TASK: Gathering Facts 22736 1727204279.98736: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204279.98738: getting variables 22736 1727204279.98739: in VariableManager get_vars() 22736 1727204279.98745: Calling all_inventory to load vars for managed-node2 22736 1727204279.98747: Calling groups_inventory to load vars for managed-node2 22736 1727204279.98750: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204279.98754: Calling all_plugins_play to load vars for managed-node2 22736 1727204279.98756: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204279.98758: Calling groups_plugins_play to load vars for managed-node2 22736 1727204279.99831: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204280.01375: done with get_vars() 22736 1727204280.01404: done getting variables 22736 1727204280.01448: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:68 Tuesday 24 September 2024 14:58:00 -0400 (0:00:00.561) 0:00:44.799 ***** 22736 1727204280.01470: entering _queue_task() for managed-node2/gather_facts 22736 1727204280.01739: worker is 1 (out of 1 available) 22736 1727204280.01751: exiting _queue_task() for managed-node2/gather_facts 22736 1727204280.01763: done queuing things up, now waiting for results queue to drain 22736 1727204280.01765: waiting for pending results... 
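The TASK [Gathering Facts] entry queued here is the implicit fact-gathering step Ansible adds at the start of a play unless fact gathering is disabled; the gather_facts action plugin then drives the setup module whose transfer and execution follow below. A hedged sketch of a play header that would produce the same implicit task (the real play is defined in tests_ethernet.yml and is not reproduced here; the extra task is hypothetical):

# Sketch only; play name taken from the banner above, everything else assumed.
- name: Assert device and profile are absent
  hosts: managed-node2
  gather_facts: true        # default; creates the implicit "Gathering Facts" task
  tasks:
    - name: Optionally limit what the setup module collects (hypothetical task)
      ansible.builtin.setup:
        gather_subset:
          - "!all"
          - min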
22736 1727204280.01963: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204280.02057: in run() - task 12b410aa-8751-4f4a-548a-0000000004e4 22736 1727204280.02071: variable 'ansible_search_path' from source: unknown 22736 1727204280.02112: calling self._execute() 22736 1727204280.02188: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204280.02196: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204280.02206: variable 'omit' from source: magic vars 22736 1727204280.02554: variable 'ansible_distribution_major_version' from source: facts 22736 1727204280.02569: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204280.02575: variable 'omit' from source: magic vars 22736 1727204280.02604: variable 'omit' from source: magic vars 22736 1727204280.02637: variable 'omit' from source: magic vars 22736 1727204280.02677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204280.02713: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204280.02731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204280.02748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204280.02763: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204280.02796: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204280.02800: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204280.02803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204280.02896: Set connection var ansible_timeout to 10 22736 1727204280.02908: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204280.02916: Set connection var ansible_shell_executable to /bin/sh 22736 1727204280.02919: Set connection var ansible_shell_type to sh 22736 1727204280.02928: Set connection var ansible_pipelining to False 22736 1727204280.02931: Set connection var ansible_connection to ssh 22736 1727204280.02951: variable 'ansible_shell_executable' from source: unknown 22736 1727204280.02955: variable 'ansible_connection' from source: unknown 22736 1727204280.02958: variable 'ansible_module_compression' from source: unknown 22736 1727204280.02961: variable 'ansible_shell_type' from source: unknown 22736 1727204280.02964: variable 'ansible_shell_executable' from source: unknown 22736 1727204280.02968: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204280.02975: variable 'ansible_pipelining' from source: unknown 22736 1727204280.02978: variable 'ansible_timeout' from source: unknown 22736 1727204280.02991: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204280.03151: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204280.03163: variable 'omit' from source: magic vars 22736 1727204280.03168: starting attempt loop 22736 1727204280.03171: running the 
handler 22736 1727204280.03187: variable 'ansible_facts' from source: unknown 22736 1727204280.03212: _low_level_execute_command(): starting 22736 1727204280.03222: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204280.03771: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204280.03775: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.03779: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.03838: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.03842: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204280.03895: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.05666: stdout chunk (state=3): >>>/root <<< 22736 1727204280.05769: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204280.05834: stderr chunk (state=3): >>><<< 22736 1727204280.05838: stdout chunk (state=3): >>><<< 22736 1727204280.05866: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204280.05876: _low_level_execute_command(): starting 22736 1727204280.05884: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851 `" && echo ansible-tmp-1727204280.0586362-25062-267864435758851="` echo 
/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851 `" ) && sleep 0' 22736 1727204280.06378: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204280.06384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204280.06387: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204280.06409: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204280.06412: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.06448: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.06451: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204280.06501: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.08560: stdout chunk (state=3): >>>ansible-tmp-1727204280.0586362-25062-267864435758851=/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851 <<< 22736 1727204280.08677: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204280.08733: stderr chunk (state=3): >>><<< 22736 1727204280.08736: stdout chunk (state=3): >>><<< 22736 1727204280.08754: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204280.0586362-25062-267864435758851=/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204280.08787: variable 'ansible_module_compression' from source: unknown 22736 1727204280.08837: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204280.08894: variable 'ansible_facts' from source: unknown 22736 1727204280.09016: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py 22736 1727204280.09148: Sending initial data 22736 1727204280.09153: Sent initial data (154 bytes) 22736 1727204280.09635: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204280.09641: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.09644: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204280.09646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204280.09649: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.09704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.09710: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204280.09748: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.11433: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204280.11470: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
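The steps in this part of the log (create a remote temp directory, transfer the cached AnsiballZ_setup.py payload over SFTP, mark it executable, run it with the remote interpreter, then remove the directory) make up Ansible's default, non-pipelined module delivery, consistent with the earlier "Set connection var ansible_pipelining to False" lines. As a hedged aside, pipelining can be enabled per host or group so the payload is streamed to the remote interpreter over the existing SSH session instead of being written to disk first; the file name below is an assumption for illustration:

# group_vars/all.yml (assumed location)
ansible_pipelining: true
# The same behaviour is available via ansible.cfg:
#   [ssh_connection]
#   pipelining = True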
<<< 22736 1727204280.11508: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpg5wdnu0d /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py <<< 22736 1727204280.11515: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py" <<< 22736 1727204280.11545: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpg5wdnu0d" to remote "/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py" <<< 22736 1727204280.11552: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py" <<< 22736 1727204280.13205: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204280.13283: stderr chunk (state=3): >>><<< 22736 1727204280.13287: stdout chunk (state=3): >>><<< 22736 1727204280.13312: done transferring module to remote 22736 1727204280.13328: _low_level_execute_command(): starting 22736 1727204280.13333: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/ /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py && sleep 0' 22736 1727204280.13821: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204280.13824: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204280.13827: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204280.13829: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.13897: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.13900: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204280.13933: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.15874: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204280.15933: stderr chunk (state=3): >>><<< 22736 1727204280.15937: stdout chunk (state=3): >>><<< 22736 1727204280.15955: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204280.15958: _low_level_execute_command(): starting 22736 1727204280.15964: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/AnsiballZ_setup.py && sleep 0' 22736 1727204280.16455: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204280.16459: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204280.16461: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.16464: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204280.16466: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.16520: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.16523: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204280.16574: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.85901: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", 
"ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "00", "epoch": "1727204280", "epoch_int": "1727204280", "date": "2024-09-24", "time": "14:58:00", "iso8601_micro": "2024-09-24T18:58:00.486805Z", "iso8601": "2024-09-24T18:58:00Z", "iso8601_basic": "20240924T145800486805", "iso8601_basic_short": "20240924T145800", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.8359375, "5m": 0.6708984375, "15m": 0.416015625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c4<<< 22736 1727204280.85924: stdout chunk (state=3): >>>47b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, 
"ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2840, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 877, "free": 2840}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", 
"ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 784, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146801152, "block_size": 4096, "block_total": 64479564, "block_available": 61315137, "block_used": 3164427, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204280.88041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204280.88103: stderr chunk (state=3): >>><<< 22736 1727204280.88106: stdout chunk (state=3): >>><<< 22736 1727204280.88137: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_local": {}, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "00", "epoch": "1727204280", "epoch_int": "1727204280", "date": "2024-09-24", "time": "14:58:00", "iso8601_micro": "2024-09-24T18:58:00.486805Z", "iso8601": "2024-09-24T18:58:00Z", "iso8601_basic": "20240924T145800486805", "iso8601_basic_short": "20240924T145800", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.8359375, "5m": 
0.6708984375, "15m": 0.416015625}, "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_iscsi_iqn": "", "ansible_fibre_channel_wwn": [], "ansible_apparmor": {"status": "disabled"}, "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_lsb": {}, "ansible_service_mgr": "systemd", "ansible_hostnqn": "", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], 
"ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2840, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 877, "free": 2840}, "nocache": {"free": 3471, "used": 246}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 784, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146801152, "block_size": 4096, "block_total": 64479564, "block_available": 61315137, "block_used": 3164427, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204280.88399: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204280.88421: _low_level_execute_command(): starting 22736 1727204280.88424: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204280.0586362-25062-267864435758851/ > /dev/null 2>&1 && sleep 0' 22736 1727204280.88925: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204280.88929: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.88933: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204280.88935: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204280.88938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204280.88994: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204280.88997: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204280.89002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 
1727204280.89042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204280.91029: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204280.91087: stderr chunk (state=3): >>><<< 22736 1727204280.91095: stdout chunk (state=3): >>><<< 22736 1727204280.91113: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204280.91123: handler run complete 22736 1727204280.91231: variable 'ansible_facts' from source: unknown 22736 1727204280.91319: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204280.91581: variable 'ansible_facts' from source: unknown 22736 1727204280.91651: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204280.91754: attempt loop complete, returning result 22736 1727204280.91757: _execute() done 22736 1727204280.91762: dumping result to json 22736 1727204280.91786: done dumping result, returning 22736 1727204280.91795: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-0000000004e4] 22736 1727204280.91798: sending task result for task 12b410aa-8751-4f4a-548a-0000000004e4 22736 1727204280.92067: done sending task result for task 12b410aa-8751-4f4a-548a-0000000004e4 22736 1727204280.92071: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204280.92361: no more pending results, returning what we have 22736 1727204280.92364: results queue empty 22736 1727204280.92365: checking for any_errors_fatal 22736 1727204280.92366: done checking for any_errors_fatal 22736 1727204280.92366: checking for max_fail_percentage 22736 1727204280.92368: done checking for max_fail_percentage 22736 1727204280.92368: checking to see if all hosts have failed and the running result is not ok 22736 1727204280.92369: done checking to see if all hosts have failed 22736 1727204280.92370: getting the remaining hosts for this loop 22736 1727204280.92371: done getting the remaining hosts for this loop 22736 1727204280.92373: getting the next task for host managed-node2 22736 1727204280.92378: done getting next task for host managed-node2 22736 1727204280.92379: ^ task is: TASK: meta (flush_handlers) 22736 1727204280.92381: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204280.92384: getting variables 22736 1727204280.92385: in VariableManager get_vars() 22736 1727204280.92409: Calling all_inventory to load vars for managed-node2 22736 1727204280.92412: Calling groups_inventory to load vars for managed-node2 22736 1727204280.92414: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204280.92426: Calling all_plugins_play to load vars for managed-node2 22736 1727204280.92428: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204280.92431: Calling groups_plugins_play to load vars for managed-node2 22736 1727204280.93695: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204280.95263: done with get_vars() 22736 1727204280.95288: done getting variables 22736 1727204280.95353: in VariableManager get_vars() 22736 1727204280.95362: Calling all_inventory to load vars for managed-node2 22736 1727204280.95364: Calling groups_inventory to load vars for managed-node2 22736 1727204280.95366: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204280.95370: Calling all_plugins_play to load vars for managed-node2 22736 1727204280.95371: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204280.95374: Calling groups_plugins_play to load vars for managed-node2 22736 1727204280.96551: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204280.98129: done with get_vars() 22736 1727204280.98162: done queuing things up, now waiting for results queue to drain 22736 1727204280.98164: results queue empty 22736 1727204280.98164: checking for any_errors_fatal 22736 1727204280.98169: done checking for any_errors_fatal 22736 1727204280.98170: checking for max_fail_percentage 22736 1727204280.98171: done checking for max_fail_percentage 22736 1727204280.98176: checking to see if all hosts have failed and the running result is not ok 22736 1727204280.98177: done checking to see if all hosts have failed 22736 1727204280.98178: getting the remaining hosts for this loop 22736 1727204280.98179: done getting the remaining hosts for this loop 22736 1727204280.98181: getting the next task for host managed-node2 22736 1727204280.98184: done getting next task for host managed-node2 22736 1727204280.98186: ^ task is: TASK: Include the task 'assert_profile_absent.yml' 22736 1727204280.98188: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204280.98191: getting variables 22736 1727204280.98192: in VariableManager get_vars() 22736 1727204280.98200: Calling all_inventory to load vars for managed-node2 22736 1727204280.98204: Calling groups_inventory to load vars for managed-node2 22736 1727204280.98206: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204280.98211: Calling all_plugins_play to load vars for managed-node2 22736 1727204280.98213: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204280.98215: Calling groups_plugins_play to load vars for managed-node2 22736 1727204280.99309: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.00885: done with get_vars() 22736 1727204281.00909: done getting variables TASK [Include the task 'assert_profile_absent.yml'] **************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:71 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.995) 0:00:45.794 ***** 22736 1727204281.00978: entering _queue_task() for managed-node2/include_tasks 22736 1727204281.01268: worker is 1 (out of 1 available) 22736 1727204281.01283: exiting _queue_task() for managed-node2/include_tasks 22736 1727204281.01299: done queuing things up, now waiting for results queue to drain 22736 1727204281.01300: waiting for pending results... 22736 1727204281.01493: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' 22736 1727204281.01579: in run() - task 12b410aa-8751-4f4a-548a-000000000074 22736 1727204281.01594: variable 'ansible_search_path' from source: unknown 22736 1727204281.01631: calling self._execute() 22736 1727204281.01711: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.01720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.01731: variable 'omit' from source: magic vars 22736 1727204281.02061: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.02077: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.02082: _execute() done 22736 1727204281.02086: dumping result to json 22736 1727204281.02089: done dumping result, returning 22736 1727204281.02192: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_profile_absent.yml' [12b410aa-8751-4f4a-548a-000000000074] 22736 1727204281.02195: sending task result for task 12b410aa-8751-4f4a-548a-000000000074 22736 1727204281.02278: done sending task result for task 12b410aa-8751-4f4a-548a-000000000074 22736 1727204281.02281: WORKER PROCESS EXITING 22736 1727204281.02321: no more pending results, returning what we have 22736 1727204281.02326: in VariableManager get_vars() 22736 1727204281.02358: Calling all_inventory to load vars for managed-node2 22736 1727204281.02361: Calling groups_inventory to load vars for managed-node2 22736 1727204281.02364: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.02376: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.02379: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.02382: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.03641: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.05226: done with get_vars() 22736 
1727204281.05254: variable 'ansible_search_path' from source: unknown 22736 1727204281.05268: we have included files to process 22736 1727204281.05269: generating all_blocks data 22736 1727204281.05270: done generating all_blocks data 22736 1727204281.05271: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22736 1727204281.05272: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22736 1727204281.05274: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml 22736 1727204281.05422: in VariableManager get_vars() 22736 1727204281.05436: done with get_vars() 22736 1727204281.05536: done processing included file 22736 1727204281.05537: iterating over new_blocks loaded from include file 22736 1727204281.05539: in VariableManager get_vars() 22736 1727204281.05548: done with get_vars() 22736 1727204281.05550: filtering new block on tags 22736 1727204281.05564: done filtering new block on tags 22736 1727204281.05566: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml for managed-node2 22736 1727204281.05571: extending task lists for all hosts with included blocks 22736 1727204281.05614: done extending task lists 22736 1727204281.05615: done processing included files 22736 1727204281.05616: results queue empty 22736 1727204281.05616: checking for any_errors_fatal 22736 1727204281.05619: done checking for any_errors_fatal 22736 1727204281.05620: checking for max_fail_percentage 22736 1727204281.05621: done checking for max_fail_percentage 22736 1727204281.05622: checking to see if all hosts have failed and the running result is not ok 22736 1727204281.05622: done checking to see if all hosts have failed 22736 1727204281.05623: getting the remaining hosts for this loop 22736 1727204281.05624: done getting the remaining hosts for this loop 22736 1727204281.05626: getting the next task for host managed-node2 22736 1727204281.05629: done getting next task for host managed-node2 22736 1727204281.05630: ^ task is: TASK: Include the task 'get_profile_stat.yml' 22736 1727204281.05632: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204281.05634: getting variables 22736 1727204281.05635: in VariableManager get_vars() 22736 1727204281.05642: Calling all_inventory to load vars for managed-node2 22736 1727204281.05644: Calling groups_inventory to load vars for managed-node2 22736 1727204281.05645: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.05650: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.05652: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.05654: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.06810: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.08370: done with get_vars() 22736 1727204281.08397: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:3 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.074) 0:00:45.869 ***** 22736 1727204281.08466: entering _queue_task() for managed-node2/include_tasks 22736 1727204281.08750: worker is 1 (out of 1 available) 22736 1727204281.08765: exiting _queue_task() for managed-node2/include_tasks 22736 1727204281.08778: done queuing things up, now waiting for results queue to drain 22736 1727204281.08780: waiting for pending results... 22736 1727204281.08972: running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' 22736 1727204281.09068: in run() - task 12b410aa-8751-4f4a-548a-0000000004f5 22736 1727204281.09080: variable 'ansible_search_path' from source: unknown 22736 1727204281.09084: variable 'ansible_search_path' from source: unknown 22736 1727204281.09121: calling self._execute() 22736 1727204281.09204: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.09213: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.09226: variable 'omit' from source: magic vars 22736 1727204281.09563: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.09580: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.09586: _execute() done 22736 1727204281.09592: dumping result to json 22736 1727204281.09598: done dumping result, returning 22736 1727204281.09605: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_profile_stat.yml' [12b410aa-8751-4f4a-548a-0000000004f5] 22736 1727204281.09611: sending task result for task 12b410aa-8751-4f4a-548a-0000000004f5 22736 1727204281.09706: done sending task result for task 12b410aa-8751-4f4a-548a-0000000004f5 22736 1727204281.09709: WORKER PROCESS EXITING 22736 1727204281.09741: no more pending results, returning what we have 22736 1727204281.09747: in VariableManager get_vars() 22736 1727204281.09782: Calling all_inventory to load vars for managed-node2 22736 1727204281.09785: Calling groups_inventory to load vars for managed-node2 22736 1727204281.09791: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.09809: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.09813: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.09816: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.11088: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due 
to reserved name 22736 1727204281.12672: done with get_vars() 22736 1727204281.12698: variable 'ansible_search_path' from source: unknown 22736 1727204281.12699: variable 'ansible_search_path' from source: unknown 22736 1727204281.12737: we have included files to process 22736 1727204281.12739: generating all_blocks data 22736 1727204281.12740: done generating all_blocks data 22736 1727204281.12741: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22736 1727204281.12742: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22736 1727204281.12744: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 22736 1727204281.13651: done processing included file 22736 1727204281.13652: iterating over new_blocks loaded from include file 22736 1727204281.13654: in VariableManager get_vars() 22736 1727204281.13665: done with get_vars() 22736 1727204281.13667: filtering new block on tags 22736 1727204281.13685: done filtering new block on tags 22736 1727204281.13687: in VariableManager get_vars() 22736 1727204281.13698: done with get_vars() 22736 1727204281.13700: filtering new block on tags 22736 1727204281.13722: done filtering new block on tags 22736 1727204281.13724: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed-node2 22736 1727204281.13729: extending task lists for all hosts with included blocks 22736 1727204281.13812: done extending task lists 22736 1727204281.13813: done processing included files 22736 1727204281.13814: results queue empty 22736 1727204281.13814: checking for any_errors_fatal 22736 1727204281.13820: done checking for any_errors_fatal 22736 1727204281.13821: checking for max_fail_percentage 22736 1727204281.13822: done checking for max_fail_percentage 22736 1727204281.13823: checking to see if all hosts have failed and the running result is not ok 22736 1727204281.13823: done checking to see if all hosts have failed 22736 1727204281.13824: getting the remaining hosts for this loop 22736 1727204281.13825: done getting the remaining hosts for this loop 22736 1727204281.13827: getting the next task for host managed-node2 22736 1727204281.13831: done getting next task for host managed-node2 22736 1727204281.13832: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 22736 1727204281.13835: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204281.13837: getting variables 22736 1727204281.13838: in VariableManager get_vars() 22736 1727204281.13986: Calling all_inventory to load vars for managed-node2 22736 1727204281.13991: Calling groups_inventory to load vars for managed-node2 22736 1727204281.13993: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.13999: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.14001: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.14003: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.15064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.16637: done with get_vars() 22736 1727204281.16664: done getting variables 22736 1727204281.16706: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.082) 0:00:45.952 ***** 22736 1727204281.16737: entering _queue_task() for managed-node2/set_fact 22736 1727204281.17025: worker is 1 (out of 1 available) 22736 1727204281.17040: exiting _queue_task() for managed-node2/set_fact 22736 1727204281.17053: done queuing things up, now waiting for results queue to drain 22736 1727204281.17055: waiting for pending results... 
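Note: the set_fact task queued above (get_profile_stat.yml:3) only initializes the three profile flags that later tasks may overwrite. A minimal sketch of such a task, using the flag names and default values reported in the task result further below:

    - name: Initialize NM profile exist and ansible_managed comment flag
      ansible.builtin.set_fact:
        # All three flags start out false; later tasks in get_profile_stat.yml
        # flip them if the profile file or NM connection is actually found.
        lsr_net_profile_exists: false
        lsr_net_profile_ansible_managed: false
        lsr_net_profile_fingerprint: false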
22736 1727204281.17242: running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag 22736 1727204281.17341: in run() - task 12b410aa-8751-4f4a-548a-000000000502 22736 1727204281.17355: variable 'ansible_search_path' from source: unknown 22736 1727204281.17359: variable 'ansible_search_path' from source: unknown 22736 1727204281.17395: calling self._execute() 22736 1727204281.17484: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.17493: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.17512: variable 'omit' from source: magic vars 22736 1727204281.17842: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.17850: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.17857: variable 'omit' from source: magic vars 22736 1727204281.17898: variable 'omit' from source: magic vars 22736 1727204281.17936: variable 'omit' from source: magic vars 22736 1727204281.17973: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204281.18007: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204281.18028: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204281.18047: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.18061: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.18088: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204281.18093: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.18098: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.18187: Set connection var ansible_timeout to 10 22736 1727204281.18200: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204281.18211: Set connection var ansible_shell_executable to /bin/sh 22736 1727204281.18214: Set connection var ansible_shell_type to sh 22736 1727204281.18223: Set connection var ansible_pipelining to False 22736 1727204281.18225: Set connection var ansible_connection to ssh 22736 1727204281.18246: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.18249: variable 'ansible_connection' from source: unknown 22736 1727204281.18253: variable 'ansible_module_compression' from source: unknown 22736 1727204281.18258: variable 'ansible_shell_type' from source: unknown 22736 1727204281.18261: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.18263: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.18275: variable 'ansible_pipelining' from source: unknown 22736 1727204281.18279: variable 'ansible_timeout' from source: unknown 22736 1727204281.18282: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.18410: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204281.18424: variable 
'omit' from source: magic vars 22736 1727204281.18430: starting attempt loop 22736 1727204281.18433: running the handler 22736 1727204281.18447: handler run complete 22736 1727204281.18457: attempt loop complete, returning result 22736 1727204281.18459: _execute() done 22736 1727204281.18464: dumping result to json 22736 1727204281.18469: done dumping result, returning 22736 1727204281.18476: done running TaskExecutor() for managed-node2/TASK: Initialize NM profile exist and ansible_managed comment flag [12b410aa-8751-4f4a-548a-000000000502] 22736 1727204281.18482: sending task result for task 12b410aa-8751-4f4a-548a-000000000502 22736 1727204281.18570: done sending task result for task 12b410aa-8751-4f4a-548a-000000000502 22736 1727204281.18573: WORKER PROCESS EXITING ok: [managed-node2] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 22736 1727204281.18655: no more pending results, returning what we have 22736 1727204281.18659: results queue empty 22736 1727204281.18660: checking for any_errors_fatal 22736 1727204281.18663: done checking for any_errors_fatal 22736 1727204281.18663: checking for max_fail_percentage 22736 1727204281.18665: done checking for max_fail_percentage 22736 1727204281.18666: checking to see if all hosts have failed and the running result is not ok 22736 1727204281.18667: done checking to see if all hosts have failed 22736 1727204281.18668: getting the remaining hosts for this loop 22736 1727204281.18670: done getting the remaining hosts for this loop 22736 1727204281.18674: getting the next task for host managed-node2 22736 1727204281.18682: done getting next task for host managed-node2 22736 1727204281.18685: ^ task is: TASK: Stat profile file 22736 1727204281.18691: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204281.18696: getting variables 22736 1727204281.18697: in VariableManager get_vars() 22736 1727204281.18726: Calling all_inventory to load vars for managed-node2 22736 1727204281.18730: Calling groups_inventory to load vars for managed-node2 22736 1727204281.18733: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.18745: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.18748: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.18752: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.20077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.21646: done with get_vars() 22736 1727204281.21672: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.050) 0:00:46.002 ***** 22736 1727204281.21758: entering _queue_task() for managed-node2/stat 22736 1727204281.22036: worker is 1 (out of 1 available) 22736 1727204281.22051: exiting _queue_task() for managed-node2/stat 22736 1727204281.22063: done queuing things up, now waiting for results queue to drain 22736 1727204281.22065: waiting for pending results... 22736 1727204281.22258: running TaskExecutor() for managed-node2/TASK: Stat profile file 22736 1727204281.22350: in run() - task 12b410aa-8751-4f4a-548a-000000000503 22736 1727204281.22364: variable 'ansible_search_path' from source: unknown 22736 1727204281.22367: variable 'ansible_search_path' from source: unknown 22736 1727204281.22402: calling self._execute() 22736 1727204281.22486: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.22495: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.22505: variable 'omit' from source: magic vars 22736 1727204281.22837: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.22848: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.22860: variable 'omit' from source: magic vars 22736 1727204281.22902: variable 'omit' from source: magic vars 22736 1727204281.22988: variable 'profile' from source: include params 22736 1727204281.22998: variable 'interface' from source: set_fact 22736 1727204281.23062: variable 'interface' from source: set_fact 22736 1727204281.23082: variable 'omit' from source: magic vars 22736 1727204281.23122: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204281.23153: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204281.23173: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204281.23194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.23207: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.23234: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204281.23237: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.23242: variable 
'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.23331: Set connection var ansible_timeout to 10 22736 1727204281.23341: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204281.23350: Set connection var ansible_shell_executable to /bin/sh 22736 1727204281.23353: Set connection var ansible_shell_type to sh 22736 1727204281.23360: Set connection var ansible_pipelining to False 22736 1727204281.23362: Set connection var ansible_connection to ssh 22736 1727204281.23382: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.23385: variable 'ansible_connection' from source: unknown 22736 1727204281.23388: variable 'ansible_module_compression' from source: unknown 22736 1727204281.23393: variable 'ansible_shell_type' from source: unknown 22736 1727204281.23402: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.23406: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.23409: variable 'ansible_pipelining' from source: unknown 22736 1727204281.23411: variable 'ansible_timeout' from source: unknown 22736 1727204281.23420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.23594: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204281.23605: variable 'omit' from source: magic vars 22736 1727204281.23612: starting attempt loop 22736 1727204281.23615: running the handler 22736 1727204281.23632: _low_level_execute_command(): starting 22736 1727204281.23639: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204281.24187: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.24204: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204281.24209: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.24261: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.24264: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204281.24269: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.24317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.26106: stdout chunk (state=3): >>>/root <<< 22736 1727204281.26208: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.26270: stderr chunk (state=3): >>><<< 22736 1727204281.26274: 
stdout chunk (state=3): >>><<< 22736 1727204281.26298: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.26310: _low_level_execute_command(): starting 22736 1727204281.26317: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777 `" && echo ansible-tmp-1727204281.2629874-25080-30020951773777="` echo /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777 `" ) && sleep 0' 22736 1727204281.26800: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.26804: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204281.26807: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 22736 1727204281.26820: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.26867: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.26870: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.26917: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.28977: stdout chunk (state=3): >>>ansible-tmp-1727204281.2629874-25080-30020951773777=/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777 <<< 22736 1727204281.29194: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.29198: stdout chunk (state=3): >>><<< 22736 1727204281.29201: stderr chunk (state=3): >>><<< 
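Note: the 'Stat profile file' task being staged here runs the stat module against the profile's ifcfg file. Reconstructed from the module arguments visible in the invocation a few entries below (path /etc/sysconfig/network-scripts/ifcfg-lsr27, checksum and mime detection disabled), such a task would look roughly like the sketch that follows; the register name profile_stat is inferred from the conditional evaluated later and is an assumption:

    - name: Stat profile file
      ansible.builtin.stat:
        # Path resolves from the profile/interface variable; here it is lsr27.
        path: /etc/sysconfig/network-scripts/ifcfg-lsr27
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: profile_stat   # assumed name, matching profile_stat.stat.exists below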
22736 1727204281.29396: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204281.2629874-25080-30020951773777=/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.29400: variable 'ansible_module_compression' from source: unknown 22736 1727204281.29403: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22736 1727204281.29428: variable 'ansible_facts' from source: unknown 22736 1727204281.29547: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py 22736 1727204281.29773: Sending initial data 22736 1727204281.29777: Sent initial data (152 bytes) 22736 1727204281.30510: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.30560: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.30582: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204281.30618: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.30701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.32423: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server 
supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204281.32457: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204281.32499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp0vqt2ofk /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py <<< 22736 1727204281.32502: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py" <<< 22736 1727204281.32533: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp0vqt2ofk" to remote "/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py" <<< 22736 1727204281.33297: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.33372: stderr chunk (state=3): >>><<< 22736 1727204281.33376: stdout chunk (state=3): >>><<< 22736 1727204281.33400: done transferring module to remote 22736 1727204281.33412: _low_level_execute_command(): starting 22736 1727204281.33417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/ /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py && sleep 0' 22736 1727204281.33898: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.33902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204281.33905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204281.33907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.33910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.33961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.33965: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.34010: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 
1727204281.35983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.36041: stderr chunk (state=3): >>><<< 22736 1727204281.36045: stdout chunk (state=3): >>><<< 22736 1727204281.36060: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.36063: _low_level_execute_command(): starting 22736 1727204281.36072: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/AnsiballZ_stat.py && sleep 0' 22736 1727204281.36564: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204281.36567: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204281.36570: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.36572: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.36574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.36631: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.36635: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.36688: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.54423: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22736 1727204281.55902: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 22736 1727204281.55967: stderr chunk (state=3): >>><<< 22736 1727204281.55971: stdout chunk (state=3): >>><<< 22736 1727204281.55990: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204281.56032: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204281.56049: _low_level_execute_command(): starting 22736 1727204281.56052: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204281.2629874-25080-30020951773777/ > /dev/null 2>&1 && sleep 0' 22736 1727204281.56674: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204281.56700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204281.56717: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.56737: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204281.56756: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204281.56770: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204281.56786: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.56886: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204281.56914: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.56993: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.58965: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.59029: stderr chunk (state=3): >>><<< 22736 1727204281.59034: stdout chunk (state=3): >>><<< 22736 1727204281.59053: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.59061: handler run complete 22736 1727204281.59082: attempt loop complete, returning result 22736 1727204281.59085: _execute() done 22736 1727204281.59090: dumping result to json 22736 1727204281.59097: done dumping result, returning 22736 1727204281.59105: done running TaskExecutor() for managed-node2/TASK: Stat profile file [12b410aa-8751-4f4a-548a-000000000503] 22736 1727204281.59109: sending task result for task 12b410aa-8751-4f4a-548a-000000000503 22736 1727204281.59221: done sending task result for task 12b410aa-8751-4f4a-548a-000000000503 22736 1727204281.59225: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22736 1727204281.59301: no more pending results, returning what we have 22736 1727204281.59305: results queue empty 22736 1727204281.59306: checking for any_errors_fatal 22736 1727204281.59315: done checking for any_errors_fatal 22736 1727204281.59316: checking for max_fail_percentage 22736 1727204281.59318: done checking for max_fail_percentage 22736 1727204281.59319: checking to see if all hosts have failed and the running result is not ok 22736 1727204281.59320: done checking to see if all hosts have failed 22736 1727204281.59321: 
getting the remaining hosts for this loop 22736 1727204281.59323: done getting the remaining hosts for this loop 22736 1727204281.59328: getting the next task for host managed-node2 22736 1727204281.59338: done getting next task for host managed-node2 22736 1727204281.59341: ^ task is: TASK: Set NM profile exist flag based on the profile files 22736 1727204281.59346: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204281.59350: getting variables 22736 1727204281.59352: in VariableManager get_vars() 22736 1727204281.59385: Calling all_inventory to load vars for managed-node2 22736 1727204281.59388: Calling groups_inventory to load vars for managed-node2 22736 1727204281.59405: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.59426: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.59430: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.59434: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.61461: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.63560: done with get_vars() 22736 1727204281.63601: done getting variables 22736 1727204281.63680: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.419) 0:00:46.422 ***** 22736 1727204281.63710: entering _queue_task() for managed-node2/set_fact 22736 1727204281.64105: worker is 1 (out of 1 available) 22736 1727204281.64119: exiting _queue_task() for managed-node2/set_fact 22736 1727204281.64133: done queuing things up, now waiting for results queue to drain 22736 1727204281.64135: waiting for pending results... 
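Note: the 'Set NM profile exist flag based on the profile files' task queued above (get_profile_stat.yml:17) is guarded by the stat result; because the ifcfg file does not exist, it is skipped in the next entry. A sketch of the pattern, assuming the flag name shown earlier and the condition reported in the skip result:

    - name: Set NM profile exist flag based on the profile files
      ansible.builtin.set_fact:
        lsr_net_profile_exists: true
      # Runs only when the ifcfg profile file was actually found on disk.
      when: profile_stat.stat.exists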
22736 1727204281.64610: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files 22736 1727204281.64616: in run() - task 12b410aa-8751-4f4a-548a-000000000504 22736 1727204281.64622: variable 'ansible_search_path' from source: unknown 22736 1727204281.64632: variable 'ansible_search_path' from source: unknown 22736 1727204281.64678: calling self._execute() 22736 1727204281.64787: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.64816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.64896: variable 'omit' from source: magic vars 22736 1727204281.65232: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.65246: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.65354: variable 'profile_stat' from source: set_fact 22736 1727204281.65370: Evaluated conditional (profile_stat.stat.exists): False 22736 1727204281.65373: when evaluation is False, skipping this task 22736 1727204281.65376: _execute() done 22736 1727204281.65380: dumping result to json 22736 1727204281.65383: done dumping result, returning 22736 1727204281.65392: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag based on the profile files [12b410aa-8751-4f4a-548a-000000000504] 22736 1727204281.65398: sending task result for task 12b410aa-8751-4f4a-548a-000000000504 22736 1727204281.65527: done sending task result for task 12b410aa-8751-4f4a-548a-000000000504 22736 1727204281.65530: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22736 1727204281.65584: no more pending results, returning what we have 22736 1727204281.65588: results queue empty 22736 1727204281.65591: checking for any_errors_fatal 22736 1727204281.65599: done checking for any_errors_fatal 22736 1727204281.65600: checking for max_fail_percentage 22736 1727204281.65602: done checking for max_fail_percentage 22736 1727204281.65603: checking to see if all hosts have failed and the running result is not ok 22736 1727204281.65604: done checking to see if all hosts have failed 22736 1727204281.65605: getting the remaining hosts for this loop 22736 1727204281.65606: done getting the remaining hosts for this loop 22736 1727204281.65611: getting the next task for host managed-node2 22736 1727204281.65618: done getting next task for host managed-node2 22736 1727204281.65621: ^ task is: TASK: Get NM profile info 22736 1727204281.65626: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204281.65629: getting variables 22736 1727204281.65631: in VariableManager get_vars() 22736 1727204281.65657: Calling all_inventory to load vars for managed-node2 22736 1727204281.65660: Calling groups_inventory to load vars for managed-node2 22736 1727204281.65664: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204281.65676: Calling all_plugins_play to load vars for managed-node2 22736 1727204281.65679: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204281.65683: Calling groups_plugins_play to load vars for managed-node2 22736 1727204281.71091: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204281.72653: done with get_vars() 22736 1727204281.72679: done getting variables 22736 1727204281.72754: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=False, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Tuesday 24 September 2024 14:58:01 -0400 (0:00:00.090) 0:00:46.512 ***** 22736 1727204281.72777: entering _queue_task() for managed-node2/shell 22736 1727204281.72779: Creating lock for shell 22736 1727204281.73071: worker is 1 (out of 1 available) 22736 1727204281.73087: exiting _queue_task() for managed-node2/shell 22736 1727204281.73103: done queuing things up, now waiting for results queue to drain 22736 1727204281.73104: waiting for pending results... 
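The task queued here (get_profile_stat.yml:25) runs through the shell action plugin; a hedged sketch of it, assembled from the module arguments, the registered variable name, and the "...ignoring" outcome visible in the trace that follows (lsr27 in the logged command is the templated {{ profile }} for this run):

# Hedged sketch of the task at get_profile_stat.yml:25; the command, the register
# name and the ignored failure are from the trace, the YAML layout is an assumption.
- name: Get NM profile info
  shell: "nmcli -f NAME,FILENAME connection show |grep {{ profile }} | grep /etc"
  register: nm_profile_exists
  ignore_errors: true   # the run below exits rc=1 (grep finds no match) and is reported as "...ignoring"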
22736 1727204281.73305: running TaskExecutor() for managed-node2/TASK: Get NM profile info 22736 1727204281.73419: in run() - task 12b410aa-8751-4f4a-548a-000000000505 22736 1727204281.73438: variable 'ansible_search_path' from source: unknown 22736 1727204281.73442: variable 'ansible_search_path' from source: unknown 22736 1727204281.73477: calling self._execute() 22736 1727204281.73557: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.73562: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.73595: variable 'omit' from source: magic vars 22736 1727204281.73909: variable 'ansible_distribution_major_version' from source: facts 22736 1727204281.73921: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204281.73929: variable 'omit' from source: magic vars 22736 1727204281.73977: variable 'omit' from source: magic vars 22736 1727204281.74070: variable 'profile' from source: include params 22736 1727204281.74074: variable 'interface' from source: set_fact 22736 1727204281.74141: variable 'interface' from source: set_fact 22736 1727204281.74159: variable 'omit' from source: magic vars 22736 1727204281.74198: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204281.74236: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204281.74255: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204281.74271: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.74284: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204281.74313: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204281.74317: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.74323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.74411: Set connection var ansible_timeout to 10 22736 1727204281.74424: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204281.74433: Set connection var ansible_shell_executable to /bin/sh 22736 1727204281.74437: Set connection var ansible_shell_type to sh 22736 1727204281.74448: Set connection var ansible_pipelining to False 22736 1727204281.74451: Set connection var ansible_connection to ssh 22736 1727204281.74471: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.74474: variable 'ansible_connection' from source: unknown 22736 1727204281.74477: variable 'ansible_module_compression' from source: unknown 22736 1727204281.74480: variable 'ansible_shell_type' from source: unknown 22736 1727204281.74484: variable 'ansible_shell_executable' from source: unknown 22736 1727204281.74487: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204281.74495: variable 'ansible_pipelining' from source: unknown 22736 1727204281.74498: variable 'ansible_timeout' from source: unknown 22736 1727204281.74503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204281.74628: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204281.74639: variable 'omit' from source: magic vars 22736 1727204281.74644: starting attempt loop 22736 1727204281.74647: running the handler 22736 1727204281.74660: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204281.74678: _low_level_execute_command(): starting 22736 1727204281.74685: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204281.75239: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.75244: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.75250: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.75311: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.75315: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204281.75321: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.75363: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.77149: stdout chunk (state=3): >>>/root <<< 22736 1727204281.77254: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.77321: stderr chunk (state=3): >>><<< 22736 1727204281.77325: stdout chunk (state=3): >>><<< 22736 1727204281.77348: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.77361: _low_level_execute_command(): starting 22736 1727204281.77368: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984 `" && echo ansible-tmp-1727204281.773494-25104-176954011446984="` echo /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984 `" ) && sleep 0' 22736 1727204281.77862: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.77873: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204281.77876: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.77879: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.77936: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.77942: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204281.77945: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.77986: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.80075: stdout chunk (state=3): >>>ansible-tmp-1727204281.773494-25104-176954011446984=/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984 <<< 22736 1727204281.80210: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.80244: stderr chunk (state=3): >>><<< 22736 1727204281.80248: stdout chunk (state=3): >>><<< 22736 1727204281.80272: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204281.773494-25104-176954011446984=/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.80305: variable 'ansible_module_compression' from source: unknown 22736 1727204281.80355: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204281.80398: variable 'ansible_facts' from source: unknown 22736 1727204281.80460: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py 22736 1727204281.80584: Sending initial data 22736 1727204281.80587: Sent initial data (155 bytes) 22736 1727204281.81075: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.81078: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.81081: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.81083: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.81141: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.81147: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.81192: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.82936: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204281.82968: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204281.83005: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpx05r3lca /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py <<< 22736 1727204281.83014: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py" <<< 22736 1727204281.83044: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpx05r3lca" to remote "/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py" <<< 22736 1727204281.83047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py" <<< 22736 1727204281.83830: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.83910: stderr chunk (state=3): >>><<< 22736 1727204281.83914: stdout chunk (state=3): >>><<< 22736 1727204281.83937: done transferring module to remote 22736 1727204281.83950: _low_level_execute_command(): starting 22736 1727204281.83959: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/ /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py && sleep 0' 22736 1727204281.84447: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204281.84450: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.84453: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.84460: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.84522: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.84524: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.84561: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204281.86496: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204281.86563: stderr chunk (state=3): >>><<< 22736 1727204281.86567: stdout chunk (state=3): >>><<< 22736 1727204281.86580: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204281.86583: _low_level_execute_command(): starting 22736 1727204281.86591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/AnsiballZ_command.py && sleep 0' 22736 1727204281.87065: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.87070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204281.87108: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.87112: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204281.87114: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 1727204281.87117: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204281.87174: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204281.87177: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204281.87234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.06746: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:58:02.048006", "end": "2024-09-24 14:58:02.066511", "delta": "0:00:00.018505", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204282.08533: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204282.08537: stdout chunk (state=3): >>><<< 22736 1727204282.08540: stderr chunk (state=3): >>><<< 22736 1727204282.08696: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "", "rc": 1, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "start": "2024-09-24 14:58:02.048006", "end": "2024-09-24 14:58:02.066511", "delta": "0:00:00.018505", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 
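The low-level steps above (home-directory probe, remote tmp dir creation, SFTP upload of AnsiballZ_command.py, chmod, python execution, and the cleanup below) all reuse the multiplexed SSH master and the connection variables set earlier in this task's trace. Expressed as host variables, those settings would look roughly like the following sketch; the values are copied from the "Set connection var" lines, the YAML form itself is illustrative:

# Illustrative host-variable equivalents of the logged connection settings.
ansible_connection: ssh
ansible_timeout: 10
ansible_shell_type: sh
ansible_shell_executable: /bin/sh
ansible_pipelining: false            # with pipelining off, the module is copied via SFTP and executed remotely
ansible_module_compression: ZIP_DEFLATED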
22736 1727204282.08701: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204282.08704: _low_level_execute_command(): starting 22736 1727204282.08706: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204281.773494-25104-176954011446984/ > /dev/null 2>&1 && sleep 0' 22736 1727204282.09241: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204282.09258: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204282.09284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.09332: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204282.09338: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.09382: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.11372: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204282.11433: stderr chunk (state=3): >>><<< 22736 1727204282.11437: stdout chunk (state=3): >>><<< 22736 1727204282.11452: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204282.11461: handler run complete 22736 1727204282.11482: Evaluated conditional (False): False 22736 1727204282.11495: attempt loop complete, returning result 22736 1727204282.11498: _execute() done 22736 1727204282.11501: dumping result to json 22736 1727204282.11508: done dumping result, returning 22736 1727204282.11516: done running TaskExecutor() for managed-node2/TASK: Get NM profile info [12b410aa-8751-4f4a-548a-000000000505] 22736 1727204282.11521: sending task result for task 12b410aa-8751-4f4a-548a-000000000505 22736 1727204282.11633: done sending task result for task 12b410aa-8751-4f4a-548a-000000000505 22736 1727204282.11638: WORKER PROCESS EXITING fatal: [managed-node2]: FAILED! => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep lsr27 | grep /etc", "delta": "0:00:00.018505", "end": "2024-09-24 14:58:02.066511", "rc": 1, "start": "2024-09-24 14:58:02.048006" } MSG: non-zero return code ...ignoring 22736 1727204282.11729: no more pending results, returning what we have 22736 1727204282.11733: results queue empty 22736 1727204282.11734: checking for any_errors_fatal 22736 1727204282.11744: done checking for any_errors_fatal 22736 1727204282.11745: checking for max_fail_percentage 22736 1727204282.11747: done checking for max_fail_percentage 22736 1727204282.11748: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.11749: done checking to see if all hosts have failed 22736 1727204282.11750: getting the remaining hosts for this loop 22736 1727204282.11752: done getting the remaining hosts for this loop 22736 1727204282.11756: getting the next task for host managed-node2 22736 1727204282.11764: done getting next task for host managed-node2 22736 1727204282.11768: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22736 1727204282.11771: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.11775: getting variables 22736 1727204282.11777: in VariableManager get_vars() 22736 1727204282.11809: Calling all_inventory to load vars for managed-node2 22736 1727204282.11813: Calling groups_inventory to load vars for managed-node2 22736 1727204282.11817: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.11832: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.11835: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.11839: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.13188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.14788: done with get_vars() 22736 1727204282.14820: done getting variables 22736 1727204282.14874: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.421) 0:00:46.933 ***** 22736 1727204282.14906: entering _queue_task() for managed-node2/set_fact 22736 1727204282.15196: worker is 1 (out of 1 available) 22736 1727204282.15211: exiting _queue_task() for managed-node2/set_fact 22736 1727204282.15228: done queuing things up, now waiting for results queue to drain 22736 1727204282.15230: waiting for pending results... 
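A hedged sketch of the set_fact queued here (get_profile_stat.yml:35); the action plugin and the when-condition come from the trace below, while the fact names are assumptions:

# Hedged sketch; fact names are assumptions, the condition comes from the
# skip reason logged below.
- name: Set NM profile exist flag and ansible_managed flag true based on the nmcli output
  set_fact:
    lsr_net_profile_exists: true            # assumed fact name
    lsr_net_profile_ansible_managed: true   # assumed fact name
  when: nm_profile_exists.rc == 0           # False here, since the nmcli | grep pipeline returned rc=1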
22736 1727204282.15422: running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 22736 1727204282.15526: in run() - task 12b410aa-8751-4f4a-548a-000000000506 22736 1727204282.15539: variable 'ansible_search_path' from source: unknown 22736 1727204282.15542: variable 'ansible_search_path' from source: unknown 22736 1727204282.15580: calling self._execute() 22736 1727204282.15664: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.15672: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.15683: variable 'omit' from source: magic vars 22736 1727204282.16034: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.16045: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.16160: variable 'nm_profile_exists' from source: set_fact 22736 1727204282.16175: Evaluated conditional (nm_profile_exists.rc == 0): False 22736 1727204282.16179: when evaluation is False, skipping this task 22736 1727204282.16182: _execute() done 22736 1727204282.16185: dumping result to json 22736 1727204282.16191: done dumping result, returning 22736 1727204282.16199: done running TaskExecutor() for managed-node2/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12b410aa-8751-4f4a-548a-000000000506] 22736 1727204282.16204: sending task result for task 12b410aa-8751-4f4a-548a-000000000506 22736 1727204282.16309: done sending task result for task 12b410aa-8751-4f4a-548a-000000000506 22736 1727204282.16312: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "nm_profile_exists.rc == 0", "skip_reason": "Conditional result was False" } 22736 1727204282.16379: no more pending results, returning what we have 22736 1727204282.16383: results queue empty 22736 1727204282.16384: checking for any_errors_fatal 22736 1727204282.16397: done checking for any_errors_fatal 22736 1727204282.16398: checking for max_fail_percentage 22736 1727204282.16399: done checking for max_fail_percentage 22736 1727204282.16400: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.16401: done checking to see if all hosts have failed 22736 1727204282.16402: getting the remaining hosts for this loop 22736 1727204282.16404: done getting the remaining hosts for this loop 22736 1727204282.16408: getting the next task for host managed-node2 22736 1727204282.16419: done getting next task for host managed-node2 22736 1727204282.16422: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 22736 1727204282.16426: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 22736 1727204282.16429: getting variables 22736 1727204282.16431: in VariableManager get_vars() 22736 1727204282.16461: Calling all_inventory to load vars for managed-node2 22736 1727204282.16464: Calling groups_inventory to load vars for managed-node2 22736 1727204282.16467: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.16480: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.16483: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.16486: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.17726: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.19324: done with get_vars() 22736 1727204282.19354: done getting variables 22736 1727204282.19409: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204282.19515: variable 'profile' from source: include params 22736 1727204282.19520: variable 'interface' from source: set_fact 22736 1727204282.19575: variable 'interface' from source: set_fact TASK [Get the ansible_managed comment in ifcfg-lsr27] ************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.046) 0:00:46.980 ***** 22736 1727204282.19604: entering _queue_task() for managed-node2/command 22736 1727204282.19875: worker is 1 (out of 1 available) 22736 1727204282.19892: exiting _queue_task() for managed-node2/command 22736 1727204282.19905: done queuing things up, now waiting for results queue to drain 22736 1727204282.19906: waiting for pending results... 
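The next two queued tasks (get_profile_stat.yml:49 and :56) look for the ansible_managed comment in the ifcfg file and record whether it was found; both are skipped in this run because profile_stat.stat.exists is false. A heavily hedged sketch of the pair, in which the grep command, the register name, and the verification fact are all assumptions:

# Heavily hedged sketch; only the task names, the action plugins
# (command / set_fact) and the shared when-condition are taken from the trace.
- name: "Get the ansible_managed comment in ifcfg-{{ profile }}"
  command: grep 'Ansible managed' /etc/sysconfig/network-scripts/ifcfg-{{ profile }}   # assumed command
  register: ansible_managed_grep   # assumed register name
  when: profile_stat.stat.exists

- name: "Verify the ansible_managed comment in ifcfg-{{ profile }}"
  set_fact:
    lsr_net_profile_ansible_managed: "{{ ansible_managed_grep.rc == 0 }}"   # assumed fact and logic
  when: profile_stat.stat.exists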
22736 1727204282.20105: running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 22736 1727204282.20209: in run() - task 12b410aa-8751-4f4a-548a-000000000508 22736 1727204282.20223: variable 'ansible_search_path' from source: unknown 22736 1727204282.20227: variable 'ansible_search_path' from source: unknown 22736 1727204282.20263: calling self._execute() 22736 1727204282.20358: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.20362: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.20375: variable 'omit' from source: magic vars 22736 1727204282.20701: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.20712: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.20819: variable 'profile_stat' from source: set_fact 22736 1727204282.20835: Evaluated conditional (profile_stat.stat.exists): False 22736 1727204282.20838: when evaluation is False, skipping this task 22736 1727204282.20841: _execute() done 22736 1727204282.20845: dumping result to json 22736 1727204282.20850: done dumping result, returning 22736 1727204282.20857: done running TaskExecutor() for managed-node2/TASK: Get the ansible_managed comment in ifcfg-lsr27 [12b410aa-8751-4f4a-548a-000000000508] 22736 1727204282.20862: sending task result for task 12b410aa-8751-4f4a-548a-000000000508 22736 1727204282.20959: done sending task result for task 12b410aa-8751-4f4a-548a-000000000508 22736 1727204282.20962: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22736 1727204282.21021: no more pending results, returning what we have 22736 1727204282.21025: results queue empty 22736 1727204282.21027: checking for any_errors_fatal 22736 1727204282.21036: done checking for any_errors_fatal 22736 1727204282.21036: checking for max_fail_percentage 22736 1727204282.21038: done checking for max_fail_percentage 22736 1727204282.21039: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.21040: done checking to see if all hosts have failed 22736 1727204282.21041: getting the remaining hosts for this loop 22736 1727204282.21042: done getting the remaining hosts for this loop 22736 1727204282.21047: getting the next task for host managed-node2 22736 1727204282.21055: done getting next task for host managed-node2 22736 1727204282.21058: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 22736 1727204282.21061: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.21066: getting variables 22736 1727204282.21068: in VariableManager get_vars() 22736 1727204282.21107: Calling all_inventory to load vars for managed-node2 22736 1727204282.21111: Calling groups_inventory to load vars for managed-node2 22736 1727204282.21114: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.21127: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.21130: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.21133: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.22493: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.24082: done with get_vars() 22736 1727204282.24114: done getting variables 22736 1727204282.24171: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204282.24272: variable 'profile' from source: include params 22736 1727204282.24275: variable 'interface' from source: set_fact 22736 1727204282.24325: variable 'interface' from source: set_fact TASK [Verify the ansible_managed comment in ifcfg-lsr27] *********************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.047) 0:00:47.028 ***** 22736 1727204282.24356: entering _queue_task() for managed-node2/set_fact 22736 1727204282.24640: worker is 1 (out of 1 available) 22736 1727204282.24656: exiting _queue_task() for managed-node2/set_fact 22736 1727204282.24670: done queuing things up, now waiting for results queue to drain 22736 1727204282.24671: waiting for pending results... 
22736 1727204282.24874: running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 22736 1727204282.24983: in run() - task 12b410aa-8751-4f4a-548a-000000000509 22736 1727204282.24997: variable 'ansible_search_path' from source: unknown 22736 1727204282.25003: variable 'ansible_search_path' from source: unknown 22736 1727204282.25041: calling self._execute() 22736 1727204282.25130: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.25137: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.25148: variable 'omit' from source: magic vars 22736 1727204282.25478: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.25490: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.25598: variable 'profile_stat' from source: set_fact 22736 1727204282.25612: Evaluated conditional (profile_stat.stat.exists): False 22736 1727204282.25616: when evaluation is False, skipping this task 22736 1727204282.25619: _execute() done 22736 1727204282.25625: dumping result to json 22736 1727204282.25628: done dumping result, returning 22736 1727204282.25636: done running TaskExecutor() for managed-node2/TASK: Verify the ansible_managed comment in ifcfg-lsr27 [12b410aa-8751-4f4a-548a-000000000509] 22736 1727204282.25641: sending task result for task 12b410aa-8751-4f4a-548a-000000000509 22736 1727204282.25742: done sending task result for task 12b410aa-8751-4f4a-548a-000000000509 22736 1727204282.25744: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22736 1727204282.25826: no more pending results, returning what we have 22736 1727204282.25830: results queue empty 22736 1727204282.25831: checking for any_errors_fatal 22736 1727204282.25841: done checking for any_errors_fatal 22736 1727204282.25842: checking for max_fail_percentage 22736 1727204282.25844: done checking for max_fail_percentage 22736 1727204282.25845: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.25846: done checking to see if all hosts have failed 22736 1727204282.25847: getting the remaining hosts for this loop 22736 1727204282.25848: done getting the remaining hosts for this loop 22736 1727204282.25855: getting the next task for host managed-node2 22736 1727204282.25862: done getting next task for host managed-node2 22736 1727204282.25865: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 22736 1727204282.25870: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.25873: getting variables 22736 1727204282.25875: in VariableManager get_vars() 22736 1727204282.25906: Calling all_inventory to load vars for managed-node2 22736 1727204282.25909: Calling groups_inventory to load vars for managed-node2 22736 1727204282.25912: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.25924: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.25927: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.25931: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.27279: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.29441: done with get_vars() 22736 1727204282.29471: done getting variables 22736 1727204282.29534: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204282.29637: variable 'profile' from source: include params 22736 1727204282.29641: variable 'interface' from source: set_fact 22736 1727204282.29691: variable 'interface' from source: set_fact TASK [Get the fingerprint comment in ifcfg-lsr27] ****************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.053) 0:00:47.082 ***** 22736 1727204282.29718: entering _queue_task() for managed-node2/command 22736 1727204282.30002: worker is 1 (out of 1 available) 22736 1727204282.30017: exiting _queue_task() for managed-node2/command 22736 1727204282.30031: done queuing things up, now waiting for results queue to drain 22736 1727204282.30033: waiting for pending results... 
22736 1727204282.30236: running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 22736 1727204282.30330: in run() - task 12b410aa-8751-4f4a-548a-00000000050a 22736 1727204282.30344: variable 'ansible_search_path' from source: unknown 22736 1727204282.30347: variable 'ansible_search_path' from source: unknown 22736 1727204282.30382: calling self._execute() 22736 1727204282.30467: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.30476: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.30485: variable 'omit' from source: magic vars 22736 1727204282.30808: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.30817: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.30928: variable 'profile_stat' from source: set_fact 22736 1727204282.30942: Evaluated conditional (profile_stat.stat.exists): False 22736 1727204282.30946: when evaluation is False, skipping this task 22736 1727204282.30949: _execute() done 22736 1727204282.30954: dumping result to json 22736 1727204282.30957: done dumping result, returning 22736 1727204282.30964: done running TaskExecutor() for managed-node2/TASK: Get the fingerprint comment in ifcfg-lsr27 [12b410aa-8751-4f4a-548a-00000000050a] 22736 1727204282.30970: sending task result for task 12b410aa-8751-4f4a-548a-00000000050a 22736 1727204282.31066: done sending task result for task 12b410aa-8751-4f4a-548a-00000000050a 22736 1727204282.31069: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22736 1727204282.31128: no more pending results, returning what we have 22736 1727204282.31133: results queue empty 22736 1727204282.31134: checking for any_errors_fatal 22736 1727204282.31142: done checking for any_errors_fatal 22736 1727204282.31143: checking for max_fail_percentage 22736 1727204282.31145: done checking for max_fail_percentage 22736 1727204282.31146: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.31147: done checking to see if all hosts have failed 22736 1727204282.31148: getting the remaining hosts for this loop 22736 1727204282.31151: done getting the remaining hosts for this loop 22736 1727204282.31155: getting the next task for host managed-node2 22736 1727204282.31162: done getting next task for host managed-node2 22736 1727204282.31165: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 22736 1727204282.31169: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.31174: getting variables 22736 1727204282.31175: in VariableManager get_vars() 22736 1727204282.31208: Calling all_inventory to load vars for managed-node2 22736 1727204282.31212: Calling groups_inventory to load vars for managed-node2 22736 1727204282.31216: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.31228: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.31231: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.31235: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.34968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.38483: done with get_vars() 22736 1727204282.38532: done getting variables 22736 1727204282.38611: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204282.38757: variable 'profile' from source: include params 22736 1727204282.38762: variable 'interface' from source: set_fact 22736 1727204282.38838: variable 'interface' from source: set_fact TASK [Verify the fingerprint comment in ifcfg-lsr27] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.091) 0:00:47.173 ***** 22736 1727204282.38876: entering _queue_task() for managed-node2/set_fact 22736 1727204282.39257: worker is 1 (out of 1 available) 22736 1727204282.39273: exiting _queue_task() for managed-node2/set_fact 22736 1727204282.39287: done queuing things up, now waiting for results queue to drain 22736 1727204282.39291: waiting for pending results... 
22736 1727204282.39580: running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 22736 1727204282.39679: in run() - task 12b410aa-8751-4f4a-548a-00000000050b 22736 1727204282.39695: variable 'ansible_search_path' from source: unknown 22736 1727204282.39699: variable 'ansible_search_path' from source: unknown 22736 1727204282.39733: calling self._execute() 22736 1727204282.39813: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.39823: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.39832: variable 'omit' from source: magic vars 22736 1727204282.40154: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.40166: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.40274: variable 'profile_stat' from source: set_fact 22736 1727204282.40288: Evaluated conditional (profile_stat.stat.exists): False 22736 1727204282.40294: when evaluation is False, skipping this task 22736 1727204282.40298: _execute() done 22736 1727204282.40308: dumping result to json 22736 1727204282.40311: done dumping result, returning 22736 1727204282.40314: done running TaskExecutor() for managed-node2/TASK: Verify the fingerprint comment in ifcfg-lsr27 [12b410aa-8751-4f4a-548a-00000000050b] 22736 1727204282.40322: sending task result for task 12b410aa-8751-4f4a-548a-00000000050b 22736 1727204282.40417: done sending task result for task 12b410aa-8751-4f4a-548a-00000000050b 22736 1727204282.40422: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 22736 1727204282.40474: no more pending results, returning what we have 22736 1727204282.40478: results queue empty 22736 1727204282.40479: checking for any_errors_fatal 22736 1727204282.40487: done checking for any_errors_fatal 22736 1727204282.40488: checking for max_fail_percentage 22736 1727204282.40492: done checking for max_fail_percentage 22736 1727204282.40493: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.40494: done checking to see if all hosts have failed 22736 1727204282.40495: getting the remaining hosts for this loop 22736 1727204282.40496: done getting the remaining hosts for this loop 22736 1727204282.40501: getting the next task for host managed-node2 22736 1727204282.40510: done getting next task for host managed-node2 22736 1727204282.40514: ^ task is: TASK: Assert that the profile is absent - '{{ profile }}' 22736 1727204282.40519: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.40525: getting variables 22736 1727204282.40526: in VariableManager get_vars() 22736 1727204282.40559: Calling all_inventory to load vars for managed-node2 22736 1727204282.40562: Calling groups_inventory to load vars for managed-node2 22736 1727204282.40566: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.40579: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.40582: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.40585: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.42431: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.44262: done with get_vars() 22736 1727204282.44294: done getting variables 22736 1727204282.44350: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204282.44454: variable 'profile' from source: include params 22736 1727204282.44457: variable 'interface' from source: set_fact 22736 1727204282.44512: variable 'interface' from source: set_fact TASK [Assert that the profile is absent - 'lsr27'] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_absent.yml:5 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.056) 0:00:47.230 ***** 22736 1727204282.44550: entering _queue_task() for managed-node2/assert 22736 1727204282.44939: worker is 1 (out of 1 available) 22736 1727204282.44953: exiting _queue_task() for managed-node2/assert 22736 1727204282.44967: done queuing things up, now waiting for results queue to drain 22736 1727204282.44968: waiting for pending results... 
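[Editor's sketch] The next task queued above, "Assert that the profile is absent - 'lsr27'" (tasks/assert_profile_absent.yml:5), runs below as a pure assert action; the only condition the executor evaluates is not lsr_net_profile_exists. The playbook source is not included in this log, so the following is a reconstruction from those debug lines, with layout and field names assumed:

    # Reconstructed from the logged action ('assert') and the evaluated
    # condition; assert_profile_absent.yml itself is not part of this log.
    - name: "Assert that the profile is absent - '{{ profile }}'"
      assert:
        that:
          - not lsr_net_profile_exists

Since asserts are evaluated entirely on the controller from already-gathered facts, the execution below sets up connection variables but never issues a remote command, and the result is the local "All assertions passed" message.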
22736 1727204282.45409: running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' 22736 1727204282.45415: in run() - task 12b410aa-8751-4f4a-548a-0000000004f6 22736 1727204282.45438: variable 'ansible_search_path' from source: unknown 22736 1727204282.45446: variable 'ansible_search_path' from source: unknown 22736 1727204282.45500: calling self._execute() 22736 1727204282.45621: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.45639: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.45654: variable 'omit' from source: magic vars 22736 1727204282.46299: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.46304: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.46306: variable 'omit' from source: magic vars 22736 1727204282.46308: variable 'omit' from source: magic vars 22736 1727204282.46387: variable 'profile' from source: include params 22736 1727204282.46404: variable 'interface' from source: set_fact 22736 1727204282.46495: variable 'interface' from source: set_fact 22736 1727204282.46534: variable 'omit' from source: magic vars 22736 1727204282.46586: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204282.46645: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204282.46675: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204282.46705: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204282.46730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204282.46777: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204282.46848: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.46851: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.46949: Set connection var ansible_timeout to 10 22736 1727204282.46961: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204282.46973: Set connection var ansible_shell_executable to /bin/sh 22736 1727204282.46976: Set connection var ansible_shell_type to sh 22736 1727204282.46982: Set connection var ansible_pipelining to False 22736 1727204282.46986: Set connection var ansible_connection to ssh 22736 1727204282.47009: variable 'ansible_shell_executable' from source: unknown 22736 1727204282.47012: variable 'ansible_connection' from source: unknown 22736 1727204282.47015: variable 'ansible_module_compression' from source: unknown 22736 1727204282.47022: variable 'ansible_shell_type' from source: unknown 22736 1727204282.47024: variable 'ansible_shell_executable' from source: unknown 22736 1727204282.47027: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.47031: variable 'ansible_pipelining' from source: unknown 22736 1727204282.47034: variable 'ansible_timeout' from source: unknown 22736 1727204282.47040: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.47174: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204282.47187: variable 'omit' from source: magic vars 22736 1727204282.47194: starting attempt loop 22736 1727204282.47197: running the handler 22736 1727204282.47306: variable 'lsr_net_profile_exists' from source: set_fact 22736 1727204282.47312: Evaluated conditional (not lsr_net_profile_exists): True 22736 1727204282.47322: handler run complete 22736 1727204282.47334: attempt loop complete, returning result 22736 1727204282.47337: _execute() done 22736 1727204282.47340: dumping result to json 22736 1727204282.47345: done dumping result, returning 22736 1727204282.47353: done running TaskExecutor() for managed-node2/TASK: Assert that the profile is absent - 'lsr27' [12b410aa-8751-4f4a-548a-0000000004f6] 22736 1727204282.47357: sending task result for task 12b410aa-8751-4f4a-548a-0000000004f6 22736 1727204282.47454: done sending task result for task 12b410aa-8751-4f4a-548a-0000000004f6 22736 1727204282.47458: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22736 1727204282.47522: no more pending results, returning what we have 22736 1727204282.47525: results queue empty 22736 1727204282.47527: checking for any_errors_fatal 22736 1727204282.47534: done checking for any_errors_fatal 22736 1727204282.47535: checking for max_fail_percentage 22736 1727204282.47536: done checking for max_fail_percentage 22736 1727204282.47538: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.47539: done checking to see if all hosts have failed 22736 1727204282.47540: getting the remaining hosts for this loop 22736 1727204282.47541: done getting the remaining hosts for this loop 22736 1727204282.47546: getting the next task for host managed-node2 22736 1727204282.47556: done getting next task for host managed-node2 22736 1727204282.47560: ^ task is: TASK: Include the task 'assert_device_absent.yml' 22736 1727204282.47562: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.47574: getting variables 22736 1727204282.47576: in VariableManager get_vars() 22736 1727204282.47609: Calling all_inventory to load vars for managed-node2 22736 1727204282.47612: Calling groups_inventory to load vars for managed-node2 22736 1727204282.47616: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.47631: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.47634: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.47637: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.49218: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.51127: done with get_vars() 22736 1727204282.51164: done getting variables TASK [Include the task 'assert_device_absent.yml'] ***************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:75 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.066) 0:00:47.297 ***** 22736 1727204282.51250: entering _queue_task() for managed-node2/include_tasks 22736 1727204282.51537: worker is 1 (out of 1 available) 22736 1727204282.51554: exiting _queue_task() for managed-node2/include_tasks 22736 1727204282.51567: done queuing things up, now waiting for results queue to drain 22736 1727204282.51568: waiting for pending results... 22736 1727204282.51774: running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' 22736 1727204282.51851: in run() - task 12b410aa-8751-4f4a-548a-000000000075 22736 1727204282.51864: variable 'ansible_search_path' from source: unknown 22736 1727204282.51903: calling self._execute() 22736 1727204282.51993: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.52000: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.52011: variable 'omit' from source: magic vars 22736 1727204282.52344: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.52359: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.52363: _execute() done 22736 1727204282.52367: dumping result to json 22736 1727204282.52372: done dumping result, returning 22736 1727204282.52379: done running TaskExecutor() for managed-node2/TASK: Include the task 'assert_device_absent.yml' [12b410aa-8751-4f4a-548a-000000000075] 22736 1727204282.52385: sending task result for task 12b410aa-8751-4f4a-548a-000000000075 22736 1727204282.52486: done sending task result for task 12b410aa-8751-4f4a-548a-000000000075 22736 1727204282.52490: WORKER PROCESS EXITING 22736 1727204282.52520: no more pending results, returning what we have 22736 1727204282.52526: in VariableManager get_vars() 22736 1727204282.52565: Calling all_inventory to load vars for managed-node2 22736 1727204282.52568: Calling groups_inventory to load vars for managed-node2 22736 1727204282.52572: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.52587: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.52593: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.52597: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.53878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.55461: done with get_vars() 22736 
1727204282.55486: variable 'ansible_search_path' from source: unknown 22736 1727204282.55503: we have included files to process 22736 1727204282.55504: generating all_blocks data 22736 1727204282.55505: done generating all_blocks data 22736 1727204282.55510: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22736 1727204282.55511: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22736 1727204282.55513: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml 22736 1727204282.55659: in VariableManager get_vars() 22736 1727204282.55673: done with get_vars() 22736 1727204282.55770: done processing included file 22736 1727204282.55772: iterating over new_blocks loaded from include file 22736 1727204282.55774: in VariableManager get_vars() 22736 1727204282.55782: done with get_vars() 22736 1727204282.55784: filtering new block on tags 22736 1727204282.55800: done filtering new block on tags 22736 1727204282.55802: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml for managed-node2 22736 1727204282.55807: extending task lists for all hosts with included blocks 22736 1727204282.55927: done extending task lists 22736 1727204282.55928: done processing included files 22736 1727204282.55929: results queue empty 22736 1727204282.55929: checking for any_errors_fatal 22736 1727204282.55933: done checking for any_errors_fatal 22736 1727204282.55933: checking for max_fail_percentage 22736 1727204282.55934: done checking for max_fail_percentage 22736 1727204282.55935: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.55935: done checking to see if all hosts have failed 22736 1727204282.55936: getting the remaining hosts for this loop 22736 1727204282.55937: done getting the remaining hosts for this loop 22736 1727204282.55939: getting the next task for host managed-node2 22736 1727204282.55942: done getting next task for host managed-node2 22736 1727204282.55943: ^ task is: TASK: Include the task 'get_interface_stat.yml' 22736 1727204282.55945: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.55947: getting variables 22736 1727204282.55948: in VariableManager get_vars() 22736 1727204282.55957: Calling all_inventory to load vars for managed-node2 22736 1727204282.55959: Calling groups_inventory to load vars for managed-node2 22736 1727204282.55961: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.55966: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.55969: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.55971: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.57148: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.58721: done with get_vars() 22736 1727204282.58751: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:3 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.075) 0:00:47.373 ***** 22736 1727204282.58828: entering _queue_task() for managed-node2/include_tasks 22736 1727204282.59110: worker is 1 (out of 1 available) 22736 1727204282.59124: exiting _queue_task() for managed-node2/include_tasks 22736 1727204282.59139: done queuing things up, now waiting for results queue to drain 22736 1727204282.59141: waiting for pending results... 22736 1727204282.59342: running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' 22736 1727204282.59438: in run() - task 12b410aa-8751-4f4a-548a-00000000053c 22736 1727204282.59451: variable 'ansible_search_path' from source: unknown 22736 1727204282.59455: variable 'ansible_search_path' from source: unknown 22736 1727204282.59491: calling self._execute() 22736 1727204282.59582: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.59587: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.59603: variable 'omit' from source: magic vars 22736 1727204282.59934: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.59947: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.59954: _execute() done 22736 1727204282.59958: dumping result to json 22736 1727204282.59963: done dumping result, returning 22736 1727204282.59970: done running TaskExecutor() for managed-node2/TASK: Include the task 'get_interface_stat.yml' [12b410aa-8751-4f4a-548a-00000000053c] 22736 1727204282.59975: sending task result for task 12b410aa-8751-4f4a-548a-00000000053c 22736 1727204282.60075: done sending task result for task 12b410aa-8751-4f4a-548a-00000000053c 22736 1727204282.60078: WORKER PROCESS EXITING 22736 1727204282.60114: no more pending results, returning what we have 22736 1727204282.60120: in VariableManager get_vars() 22736 1727204282.60161: Calling all_inventory to load vars for managed-node2 22736 1727204282.60164: Calling groups_inventory to load vars for managed-node2 22736 1727204282.60169: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.60185: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.60191: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.60195: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.61581: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped 
due to reserved name 22736 1727204282.63154: done with get_vars() 22736 1727204282.63180: variable 'ansible_search_path' from source: unknown 22736 1727204282.63181: variable 'ansible_search_path' from source: unknown 22736 1727204282.63219: we have included files to process 22736 1727204282.63220: generating all_blocks data 22736 1727204282.63222: done generating all_blocks data 22736 1727204282.63224: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204282.63225: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204282.63227: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 22736 1727204282.63393: done processing included file 22736 1727204282.63395: iterating over new_blocks loaded from include file 22736 1727204282.63396: in VariableManager get_vars() 22736 1727204282.63407: done with get_vars() 22736 1727204282.63408: filtering new block on tags 22736 1727204282.63422: done filtering new block on tags 22736 1727204282.63424: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed-node2 22736 1727204282.63428: extending task lists for all hosts with included blocks 22736 1727204282.63515: done extending task lists 22736 1727204282.63516: done processing included files 22736 1727204282.63516: results queue empty 22736 1727204282.63517: checking for any_errors_fatal 22736 1727204282.63520: done checking for any_errors_fatal 22736 1727204282.63521: checking for max_fail_percentage 22736 1727204282.63522: done checking for max_fail_percentage 22736 1727204282.63522: checking to see if all hosts have failed and the running result is not ok 22736 1727204282.63523: done checking to see if all hosts have failed 22736 1727204282.63523: getting the remaining hosts for this loop 22736 1727204282.63524: done getting the remaining hosts for this loop 22736 1727204282.63526: getting the next task for host managed-node2 22736 1727204282.63530: done getting next task for host managed-node2 22736 1727204282.63531: ^ task is: TASK: Get stat for interface {{ interface }} 22736 1727204282.63534: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204282.63535: getting variables 22736 1727204282.63536: in VariableManager get_vars() 22736 1727204282.63543: Calling all_inventory to load vars for managed-node2 22736 1727204282.63545: Calling groups_inventory to load vars for managed-node2 22736 1727204282.63547: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204282.63553: Calling all_plugins_play to load vars for managed-node2 22736 1727204282.63556: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204282.63559: Calling groups_plugins_play to load vars for managed-node2 22736 1727204282.64670: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204282.66235: done with get_vars() 22736 1727204282.66264: done getting variables 22736 1727204282.66411: variable 'interface' from source: set_fact TASK [Get stat for interface lsr27] ******************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Tuesday 24 September 2024 14:58:02 -0400 (0:00:00.076) 0:00:47.449 ***** 22736 1727204282.66438: entering _queue_task() for managed-node2/stat 22736 1727204282.66720: worker is 1 (out of 1 available) 22736 1727204282.66735: exiting _queue_task() for managed-node2/stat 22736 1727204282.66747: done queuing things up, now waiting for results queue to drain 22736 1727204282.66749: waiting for pending results... 22736 1727204282.66948: running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 22736 1727204282.67048: in run() - task 12b410aa-8751-4f4a-548a-000000000554 22736 1727204282.67060: variable 'ansible_search_path' from source: unknown 22736 1727204282.67064: variable 'ansible_search_path' from source: unknown 22736 1727204282.67100: calling self._execute() 22736 1727204282.67184: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.67190: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.67209: variable 'omit' from source: magic vars 22736 1727204282.67520: variable 'ansible_distribution_major_version' from source: facts 22736 1727204282.67538: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204282.67542: variable 'omit' from source: magic vars 22736 1727204282.67583: variable 'omit' from source: magic vars 22736 1727204282.67683: variable 'interface' from source: set_fact 22736 1727204282.67712: variable 'omit' from source: magic vars 22736 1727204282.67753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204282.67791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204282.67810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204282.67828: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204282.67840: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204282.67868: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204282.67879: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.67882: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed-node2' 22736 1727204282.67975: Set connection var ansible_timeout to 10 22736 1727204282.67992: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204282.68001: Set connection var ansible_shell_executable to /bin/sh 22736 1727204282.68004: Set connection var ansible_shell_type to sh 22736 1727204282.68011: Set connection var ansible_pipelining to False 22736 1727204282.68014: Set connection var ansible_connection to ssh 22736 1727204282.68036: variable 'ansible_shell_executable' from source: unknown 22736 1727204282.68040: variable 'ansible_connection' from source: unknown 22736 1727204282.68043: variable 'ansible_module_compression' from source: unknown 22736 1727204282.68046: variable 'ansible_shell_type' from source: unknown 22736 1727204282.68049: variable 'ansible_shell_executable' from source: unknown 22736 1727204282.68054: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204282.68059: variable 'ansible_pipelining' from source: unknown 22736 1727204282.68063: variable 'ansible_timeout' from source: unknown 22736 1727204282.68069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204282.68250: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) 22736 1727204282.68261: variable 'omit' from source: magic vars 22736 1727204282.68267: starting attempt loop 22736 1727204282.68270: running the handler 22736 1727204282.68285: _low_level_execute_command(): starting 22736 1727204282.68293: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204282.68842: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204282.68847: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204282.68852: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.68916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204282.68924: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204282.68926: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.68970: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.70793: stdout chunk (state=3): >>>/root <<< 22736 1727204282.70901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204282.70963: stderr chunk (state=3): >>><<< 22736 1727204282.70967: stdout chunk (state=3): >>><<< 22736 
1727204282.70999: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204282.71010: _low_level_execute_command(): starting 22736 1727204282.71019: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722 `" && echo ansible-tmp-1727204282.7099109-25141-271254854443722="` echo /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722 `" ) && sleep 0' 22736 1727204282.71496: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204282.71500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.71505: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204282.71528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.71578: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204282.71586: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.71629: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.73751: stdout chunk (state=3): >>>ansible-tmp-1727204282.7099109-25141-271254854443722=/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722 <<< 22736 1727204282.73961: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204282.73965: stdout chunk (state=3): >>><<< 22736 1727204282.73968: stderr chunk (state=3): >>><<< 22736 1727204282.74197: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204282.7099109-25141-271254854443722=/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204282.74201: variable 'ansible_module_compression' from source: unknown 22736 1727204282.74203: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 22736 1727204282.74205: variable 'ansible_facts' from source: unknown 22736 1727204282.74282: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py 22736 1727204282.74452: Sending initial data 22736 1727204282.74552: Sent initial data (153 bytes) 22736 1727204282.75164: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204282.75210: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.75322: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204282.75347: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.75428: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.77169: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 22736 1727204282.77210: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension 
"fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204282.77248: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204282.77324: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuddmcp3l /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py <<< 22736 1727204282.77338: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py" <<< 22736 1727204282.77387: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpuddmcp3l" to remote "/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py" <<< 22736 1727204282.78650: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204282.78814: stderr chunk (state=3): >>><<< 22736 1727204282.78820: stdout chunk (state=3): >>><<< 22736 1727204282.78823: done transferring module to remote 22736 1727204282.78826: _low_level_execute_command(): starting 22736 1727204282.78828: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/ /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py && sleep 0' 22736 1727204282.79515: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204282.79566: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204282.79583: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204282.79612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.79696: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204282.81730: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204282.81809: stderr 
chunk (state=3): >>><<< 22736 1727204282.81813: stdout chunk (state=3): >>><<< 22736 1727204282.81942: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204282.81948: _low_level_execute_command(): starting 22736 1727204282.81953: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/AnsiballZ_stat.py && sleep 0' 22736 1727204282.82616: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204282.82624: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204282.82856: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204282.82885: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.00944: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} <<< 22736 1727204283.02442: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204283.02503: stderr chunk (state=3): >>><<< 22736 1727204283.02508: stdout chunk (state=3): >>><<< 22736 1727204283.02529: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/lsr27", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204283.02556: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/lsr27', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204283.02566: _low_level_execute_command(): starting 22736 1727204283.02572: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204282.7099109-25141-271254854443722/ > /dev/null 2>&1 && sleep 0' 22736 1727204283.03071: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204283.03074: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.03077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204283.03079: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.03139: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204283.03142: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204283.03144: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.03187: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.05206: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204283.05269: stderr chunk (state=3): >>><<< 22736 1727204283.05273: stdout chunk (state=3): >>><<< 22736 1727204283.05290: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204283.05300: handler run complete 22736 1727204283.05326: attempt loop complete, returning result 22736 1727204283.05330: _execute() done 22736 1727204283.05332: dumping result to json 22736 1727204283.05338: done dumping result, returning 22736 1727204283.05348: done running TaskExecutor() for managed-node2/TASK: Get stat for interface lsr27 [12b410aa-8751-4f4a-548a-000000000554] 22736 1727204283.05353: sending task result for task 12b410aa-8751-4f4a-548a-000000000554 22736 1727204283.05457: done sending task result for task 12b410aa-8751-4f4a-548a-000000000554 22736 1727204283.05460: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "stat": { "exists": false } } 22736 1727204283.05534: no more pending results, returning what we have 22736 1727204283.05539: results queue empty 22736 1727204283.05540: checking for any_errors_fatal 22736 1727204283.05542: done checking for any_errors_fatal 22736 1727204283.05543: checking for max_fail_percentage 22736 1727204283.05544: done checking for max_fail_percentage 22736 1727204283.05546: checking to see if all hosts have failed and the running result is not ok 22736 1727204283.05547: done checking to see if all hosts have failed 22736 1727204283.05548: getting the remaining hosts for this loop 22736 1727204283.05550: done getting the remaining hosts for this loop 22736 1727204283.05555: getting the next task for host managed-node2 22736 1727204283.05564: done getting next task for host 
managed-node2 22736 1727204283.05567: ^ task is: TASK: Assert that the interface is absent - '{{ interface }}' 22736 1727204283.05572: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204283.05577: getting variables 22736 1727204283.05579: in VariableManager get_vars() 22736 1727204283.05612: Calling all_inventory to load vars for managed-node2 22736 1727204283.05615: Calling groups_inventory to load vars for managed-node2 22736 1727204283.05621: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.05634: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.05637: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.05641: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.06965: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.08563: done with get_vars() 22736 1727204283.08596: done getting variables 22736 1727204283.08655: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) 22736 1727204283.08763: variable 'interface' from source: set_fact TASK [Assert that the interface is absent - 'lsr27'] *************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_absent.yml:5 Tuesday 24 September 2024 14:58:03 -0400 (0:00:00.423) 0:00:47.872 ***** 22736 1727204283.08792: entering _queue_task() for managed-node2/assert 22736 1727204283.09080: worker is 1 (out of 1 available) 22736 1727204283.09098: exiting _queue_task() for managed-node2/assert 22736 1727204283.09111: done queuing things up, now waiting for results queue to drain 22736 1727204283.09113: waiting for pending results... 
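[Editor's sketch] The stat task above checked /sys/class/net/lsr27 on the managed node and returned "exists": false, and the assert queued here (tasks/assert_device_absent.yml:5) then passes on not interface_stat.stat.exists. Neither get_interface_stat.yml nor assert_device_absent.yml is reproduced in this log, so the pair below is pieced together from the logged module_args and the evaluated condition; the register name is inferred from the interface_stat variable used in the assertion.

    # Sketch based on the module_args and condition shown in this log; treat the
    # exact task layout and the register name as assumptions.
    - name: "Get stat for interface {{ interface }}"
      stat:
        path: "/sys/class/net/{{ interface }}"
        get_attributes: false
        get_checksum: false
        get_mime: false
      register: interface_stat

    - name: "Assert that the interface is absent - '{{ interface }}'"
      assert:
        that:
          - not interface_stat.stat.exists

The stat task is the only step in this stretch that touches the remote host, which is why it accounts for the 0.423 s duration stamped on the assert header below, versus the sub-0.1 s timings of the controller-only tasks.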
22736 1727204283.09313: running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' 22736 1727204283.09399: in run() - task 12b410aa-8751-4f4a-548a-00000000053d 22736 1727204283.09414: variable 'ansible_search_path' from source: unknown 22736 1727204283.09421: variable 'ansible_search_path' from source: unknown 22736 1727204283.09455: calling self._execute() 22736 1727204283.09537: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.09544: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.09557: variable 'omit' from source: magic vars 22736 1727204283.09877: variable 'ansible_distribution_major_version' from source: facts 22736 1727204283.09894: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204283.09900: variable 'omit' from source: magic vars 22736 1727204283.09938: variable 'omit' from source: magic vars 22736 1727204283.10031: variable 'interface' from source: set_fact 22736 1727204283.10048: variable 'omit' from source: magic vars 22736 1727204283.10087: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204283.10129: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204283.10146: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204283.10162: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204283.10175: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204283.10205: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204283.10211: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.10215: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.10301: Set connection var ansible_timeout to 10 22736 1727204283.10312: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204283.10323: Set connection var ansible_shell_executable to /bin/sh 22736 1727204283.10325: Set connection var ansible_shell_type to sh 22736 1727204283.10335: Set connection var ansible_pipelining to False 22736 1727204283.10337: Set connection var ansible_connection to ssh 22736 1727204283.10359: variable 'ansible_shell_executable' from source: unknown 22736 1727204283.10363: variable 'ansible_connection' from source: unknown 22736 1727204283.10366: variable 'ansible_module_compression' from source: unknown 22736 1727204283.10370: variable 'ansible_shell_type' from source: unknown 22736 1727204283.10372: variable 'ansible_shell_executable' from source: unknown 22736 1727204283.10377: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.10382: variable 'ansible_pipelining' from source: unknown 22736 1727204283.10386: variable 'ansible_timeout' from source: unknown 22736 1727204283.10392: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.10520: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, 
class_only=False) 22736 1727204283.10529: variable 'omit' from source: magic vars 22736 1727204283.10535: starting attempt loop 22736 1727204283.10538: running the handler 22736 1727204283.10673: variable 'interface_stat' from source: set_fact 22736 1727204283.10682: Evaluated conditional (not interface_stat.stat.exists): True 22736 1727204283.10691: handler run complete 22736 1727204283.10706: attempt loop complete, returning result 22736 1727204283.10709: _execute() done 22736 1727204283.10712: dumping result to json 22736 1727204283.10720: done dumping result, returning 22736 1727204283.10726: done running TaskExecutor() for managed-node2/TASK: Assert that the interface is absent - 'lsr27' [12b410aa-8751-4f4a-548a-00000000053d] 22736 1727204283.10731: sending task result for task 12b410aa-8751-4f4a-548a-00000000053d 22736 1727204283.10829: done sending task result for task 12b410aa-8751-4f4a-548a-00000000053d 22736 1727204283.10833: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false } MSG: All assertions passed 22736 1727204283.10888: no more pending results, returning what we have 22736 1727204283.10894: results queue empty 22736 1727204283.10895: checking for any_errors_fatal 22736 1727204283.10907: done checking for any_errors_fatal 22736 1727204283.10907: checking for max_fail_percentage 22736 1727204283.10909: done checking for max_fail_percentage 22736 1727204283.10910: checking to see if all hosts have failed and the running result is not ok 22736 1727204283.10911: done checking to see if all hosts have failed 22736 1727204283.10912: getting the remaining hosts for this loop 22736 1727204283.10914: done getting the remaining hosts for this loop 22736 1727204283.10921: getting the next task for host managed-node2 22736 1727204283.10932: done getting next task for host managed-node2 22736 1727204283.10934: ^ task is: TASK: meta (flush_handlers) 22736 1727204283.10937: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204283.10941: getting variables 22736 1727204283.10943: in VariableManager get_vars() 22736 1727204283.10976: Calling all_inventory to load vars for managed-node2 22736 1727204283.10980: Calling groups_inventory to load vars for managed-node2 22736 1727204283.10985: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.11005: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.11010: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.11013: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.12400: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.14562: done with get_vars() 22736 1727204283.14597: done getting variables 22736 1727204283.14662: in VariableManager get_vars() 22736 1727204283.14671: Calling all_inventory to load vars for managed-node2 22736 1727204283.14673: Calling groups_inventory to load vars for managed-node2 22736 1727204283.14675: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.14680: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.14681: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.14684: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.15772: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.18565: done with get_vars() 22736 1727204283.18622: done queuing things up, now waiting for results queue to drain 22736 1727204283.18624: results queue empty 22736 1727204283.18626: checking for any_errors_fatal 22736 1727204283.18630: done checking for any_errors_fatal 22736 1727204283.18631: checking for max_fail_percentage 22736 1727204283.18632: done checking for max_fail_percentage 22736 1727204283.18633: checking to see if all hosts have failed and the running result is not ok 22736 1727204283.18634: done checking to see if all hosts have failed 22736 1727204283.18642: getting the remaining hosts for this loop 22736 1727204283.18644: done getting the remaining hosts for this loop 22736 1727204283.18647: getting the next task for host managed-node2 22736 1727204283.18652: done getting next task for host managed-node2 22736 1727204283.18654: ^ task is: TASK: meta (flush_handlers) 22736 1727204283.18656: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204283.18659: getting variables 22736 1727204283.18661: in VariableManager get_vars() 22736 1727204283.18672: Calling all_inventory to load vars for managed-node2 22736 1727204283.18675: Calling groups_inventory to load vars for managed-node2 22736 1727204283.18678: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.18686: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.18691: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.18695: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.20778: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.23644: done with get_vars() 22736 1727204283.23698: done getting variables 22736 1727204283.23766: in VariableManager get_vars() 22736 1727204283.23781: Calling all_inventory to load vars for managed-node2 22736 1727204283.23784: Calling groups_inventory to load vars for managed-node2 22736 1727204283.23787: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.23796: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.23799: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.23803: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.30394: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.33207: done with get_vars() 22736 1727204283.33263: done queuing things up, now waiting for results queue to drain 22736 1727204283.33266: results queue empty 22736 1727204283.33267: checking for any_errors_fatal 22736 1727204283.33269: done checking for any_errors_fatal 22736 1727204283.33270: checking for max_fail_percentage 22736 1727204283.33271: done checking for max_fail_percentage 22736 1727204283.33272: checking to see if all hosts have failed and the running result is not ok 22736 1727204283.33273: done checking to see if all hosts have failed 22736 1727204283.33273: getting the remaining hosts for this loop 22736 1727204283.33274: done getting the remaining hosts for this loop 22736 1727204283.33278: getting the next task for host managed-node2 22736 1727204283.33281: done getting next task for host managed-node2 22736 1727204283.33282: ^ task is: None 22736 1727204283.33284: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204283.33285: done queuing things up, now waiting for results queue to drain 22736 1727204283.33286: results queue empty 22736 1727204283.33287: checking for any_errors_fatal 22736 1727204283.33288: done checking for any_errors_fatal 22736 1727204283.33288: checking for max_fail_percentage 22736 1727204283.33292: done checking for max_fail_percentage 22736 1727204283.33293: checking to see if all hosts have failed and the running result is not ok 22736 1727204283.33294: done checking to see if all hosts have failed 22736 1727204283.33295: getting the next task for host managed-node2 22736 1727204283.33298: done getting next task for host managed-node2 22736 1727204283.33299: ^ task is: None 22736 1727204283.33301: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204283.33335: in VariableManager get_vars() 22736 1727204283.33352: done with get_vars() 22736 1727204283.33358: in VariableManager get_vars() 22736 1727204283.33367: done with get_vars() 22736 1727204283.33371: variable 'omit' from source: magic vars 22736 1727204283.33404: in VariableManager get_vars() 22736 1727204283.33414: done with get_vars() 22736 1727204283.33435: variable 'omit' from source: magic vars PLAY [Verify that cleanup restored state to default] *************************** 22736 1727204283.33740: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 22736 1727204283.33762: getting the remaining hosts for this loop 22736 1727204283.33764: done getting the remaining hosts for this loop 22736 1727204283.33767: getting the next task for host managed-node2 22736 1727204283.33769: done getting next task for host managed-node2 22736 1727204283.33772: ^ task is: TASK: Gathering Facts 22736 1727204283.33773: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204283.33775: getting variables 22736 1727204283.33776: in VariableManager get_vars() 22736 1727204283.33786: Calling all_inventory to load vars for managed-node2 22736 1727204283.33791: Calling groups_inventory to load vars for managed-node2 22736 1727204283.33794: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204283.33800: Calling all_plugins_play to load vars for managed-node2 22736 1727204283.33803: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204283.33807: Calling groups_plugins_play to load vars for managed-node2 22736 1727204283.35830: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204283.38970: done with get_vars() 22736 1727204283.39017: done getting variables 22736 1727204283.39080: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Tuesday 24 September 2024 14:58:03 -0400 (0:00:00.303) 0:00:48.176 ***** 22736 1727204283.39114: entering _queue_task() for managed-node2/gather_facts 22736 1727204283.39490: worker is 1 (out of 1 available) 22736 1727204283.39504: exiting _queue_task() for managed-node2/gather_facts 22736 1727204283.39518: done queuing things up, now waiting for results queue to drain 22736 1727204283.39520: waiting for pending results... 
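The play entered here ("Verify that cleanup restored state to default") and its first explicit task live in tests_ethernet.yml (task paths :77 and :80 in this log). The playbook text itself is not part of the log, so the sketch below is an assumption reconstructed from the play banner, the Gathering Facts task, and the later "Verify network state restored to default" entries, which identify that task as an include_tasks; the included file name (verify_network_state.yml) is a hypothetical placeholder, since it is not visible in this excerpt:

- name: Verify that cleanup restored state to default
  hosts: all                      # assumed; only managed-node2 appears in this excerpt
  tasks:
    - name: Verify network state restored to default
      # the log records this task as an include_tasks action; the target file
      # below is a placeholder, not taken from the log
      ansible.builtin.include_tasks: verify_network_state.yml

Because gather_facts is not disabled, the play begins with the implicit Gathering Facts task traced in the entries that follow, including the full ansible.legacy.setup module round-trip over SSH.
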
22736 1727204283.39841: running TaskExecutor() for managed-node2/TASK: Gathering Facts 22736 1727204283.40020: in run() - task 12b410aa-8751-4f4a-548a-00000000056d 22736 1727204283.40025: variable 'ansible_search_path' from source: unknown 22736 1727204283.40051: calling self._execute() 22736 1727204283.40171: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.40236: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.40240: variable 'omit' from source: magic vars 22736 1727204283.40696: variable 'ansible_distribution_major_version' from source: facts 22736 1727204283.40715: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204283.40728: variable 'omit' from source: magic vars 22736 1727204283.40768: variable 'omit' from source: magic vars 22736 1727204283.40824: variable 'omit' from source: magic vars 22736 1727204283.40888: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204283.40928: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204283.41194: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204283.41198: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204283.41201: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204283.41205: variable 'inventory_hostname' from source: host vars for 'managed-node2' 22736 1727204283.41208: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.41210: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.41393: Set connection var ansible_timeout to 10 22736 1727204283.41699: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204283.41703: Set connection var ansible_shell_executable to /bin/sh 22736 1727204283.41706: Set connection var ansible_shell_type to sh 22736 1727204283.41709: Set connection var ansible_pipelining to False 22736 1727204283.41711: Set connection var ansible_connection to ssh 22736 1727204283.41713: variable 'ansible_shell_executable' from source: unknown 22736 1727204283.41716: variable 'ansible_connection' from source: unknown 22736 1727204283.41719: variable 'ansible_module_compression' from source: unknown 22736 1727204283.41722: variable 'ansible_shell_type' from source: unknown 22736 1727204283.41724: variable 'ansible_shell_executable' from source: unknown 22736 1727204283.41727: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204283.41756: variable 'ansible_pipelining' from source: unknown 22736 1727204283.41767: variable 'ansible_timeout' from source: unknown 22736 1727204283.41780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204283.42338: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204283.42368: variable 'omit' from source: magic vars 22736 1727204283.42382: starting attempt loop 22736 1727204283.42418: running the 
handler 22736 1727204283.42445: variable 'ansible_facts' from source: unknown 22736 1727204283.42682: _low_level_execute_command(): starting 22736 1727204283.42686: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204283.44138: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.44181: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204283.44221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.44250: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204283.44306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204283.44550: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.44582: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.46412: stdout chunk (state=3): >>>/root <<< 22736 1727204283.46510: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204283.46602: stderr chunk (state=3): >>><<< 22736 1727204283.46627: stdout chunk (state=3): >>><<< 22736 1727204283.46653: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204283.46775: _low_level_execute_command(): starting 22736 1727204283.46779: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571 `" && echo ansible-tmp-1727204283.46661-25164-121215500273571="` echo 
/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571 `" ) && sleep 0' 22736 1727204283.47405: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204283.47422: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204283.47457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204283.47577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204283.47617: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.47702: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.49766: stdout chunk (state=3): >>>ansible-tmp-1727204283.46661-25164-121215500273571=/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571 <<< 22736 1727204283.49994: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204283.50001: stdout chunk (state=3): >>><<< 22736 1727204283.50005: stderr chunk (state=3): >>><<< 22736 1727204283.50030: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204283.46661-25164-121215500273571=/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204283.50079: variable 'ansible_module_compression' from source: unknown 22736 1727204283.50159: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 22736 1727204283.50260: variable 'ansible_facts' from source: unknown 22736 
1727204283.50545: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py 22736 1727204283.50678: Sending initial data 22736 1727204283.50692: Sent initial data (152 bytes) 22736 1727204283.51543: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.51566: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204283.51653: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.51673: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204283.51707: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.51780: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.53499: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204283.53559: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 22736 1727204283.53614: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py" <<< 22736 1727204283.53619: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp5yph49kb /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py <<< 22736 1727204283.53658: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmp5yph49kb" to remote "/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py" <<< 22736 1727204283.56115: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204283.56177: stderr chunk (state=3): >>><<< 22736 1727204283.56196: stdout chunk (state=3): >>><<< 22736 1727204283.56246: done transferring module to remote 22736 1727204283.56266: _low_level_execute_command(): starting 22736 1727204283.56277: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/ /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py && sleep 0' 22736 1727204283.56945: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 <<< 22736 1727204283.56963: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204283.56983: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204283.57006: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 22736 1727204283.57026: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204283.57040: stderr chunk (state=3): >>>debug2: match not found <<< 22736 1727204283.57056: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.57077: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 22736 1727204283.57174: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204283.57201: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.57277: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204283.59294: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204283.59308: stdout chunk (state=3): >>><<< 22736 1727204283.59323: stderr chunk (state=3): >>><<< 22736 1727204283.59350: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204283.59360: _low_level_execute_command(): starting 22736 1727204283.59370: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/AnsiballZ_setup.py && sleep 0' 22736 1727204283.60110: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204283.60162: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204283.60186: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204283.60260: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.29584: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", 
"hour": "14", "minute": "58", "second": "03", "epoch": "1727204283", "epoch_int": "1727204283", "date": "2024-09-24", "time": "14:58:03", "iso8601_micro": "2024-09-24T18:58:03.916437Z", "iso8601": "2024-09-24T18:58:03Z", "iso8601_basic": "20240924T145803916437", "iso8601_basic_short": "20240924T145803", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": 
"", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 787, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146764288, "block_size": 4096, "block_total": 64479564, "block_available": 61315128, "block_used": 
3164436, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04<<< 22736 1727204285.29605: stdout chunk (state=3): >>>700"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.84912109375, "5m": 0.6767578125, "15m": 0.41943359375}, "ansible_hostnqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 22736 1727204285.31844: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204285.31911: stderr chunk (state=3): >>><<< 22736 1727204285.31915: stdout chunk (state=3): >>><<< 22736 1727204285.31945: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.10.10-100.fc39.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Thu Sep 12 16:02:41 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.6", "ansible_fqdn": "managed-node2", "ansible_hostname": "managed-node2", "ansible_nodename": "managed-node2", "ansible_domain": "", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2892d4fd9a460f25d759cc0f5d8af1", "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Tuesday", "weekday_number": "2", "weeknumber": "39", "day": "24", "hour": "14", "minute": "58", "second": "03", "epoch": "1727204283", "epoch_int": "1727204283", "date": "2024-09-24", "time": "14:58:03", "iso8601_micro": "2024-09-24T18:58:03.916437Z", "iso8601": "2024-09-24T18:58:03Z", "iso8601_basic": "20240924T145803916437", "iso8601_basic_short": "20240924T145803", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_fips": false, "ansible_ssh_host_key_dsa_public": "AAAAB3NzaC1kc3MAAACBALbaOakStW0fyiQZlEbXXncyR87qoLwgowQVWaebe3krm4gvuCt8TEBUfw0mvR4vRmcqUFmhI4AbiuXsKjiKxYFB/ooFAFGN2e6y+1BkIHStqZxat0Y6htEvv337Meighz6u4pGZEgQEXl0ig7Y+tf1HK9SNBPwBoKrPiHstNXETAAAAFQC0R2WwVc0wxArNV55xrf3dfHNWEQAAAIEAo/K5c9iXaO8AN4REpo0lcla4lv+lJi0cx3ULyqXk29wMFoYuCVuURFK5VsZtDamN+tQue7jmD7ahg1el3AY59W2+/jBX/PDqBX5d5hkCT5l1P1pM9W7VWIAnW2bTgBesa35ADM2OfJUMm1IsNOumpK1iQT/XyN91/OxLiGKgrMgAAACAN/CC3jIAXZAZ0zkpFewrJ/MlTMSUDdSEmporHlosvIG/LxawIWdARjc8FkZq5Jfegk13jOPmlayjUxnOM+hf9btoZwUltkLd1Fc9ZcVY1ymW2wgCVuM9rpaoZqnjt78dM3IM44Qp9Sk1/L3S3h1Z+cMYhDvB5DxtlwjZVchXcmQ=", "ansible_ssh_host_key_dsa_public_keytype": "ssh-dss", "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQCjyZhimTUnhmFPwjr+BtR5YHy7jZv+SoyMmBHl0ndk/Fr/iV1MZ+PQJ6OFpiPmaZtSOueURUO+OKS+4S49W/UIwiptYVVaOkV7sdXaewHBjm5YOl2NFydYAB8UITyvBHGaJZOaoke5+fQ3ihL1G3r5GLcwhEYCOV19071uYcYri6mAxRuYo8+UZnhWOLYQWKli76eIy9Vz6bwjFbx6OLorjNqD80e8MSI5md6V7KEGfBgHxjZ5fYUqtMZgywq/a5O7EPqYtcq6vjwbKkY2ItpEyZi968BXPSHD3ivbCDafyHyuhicTsQky5kNlF7aCVRtIB25kC1AsH27GfOLIhv+R/DqyZplFvITz1SWXMHlsQ5eVF6NS6WP8YKN0BDnb2bhohWuM9RWwzBfJ7jApZPbiCk6fJbXlW3rKlmyV1mrxXEsdDm8jspkVADYxIfpc7g0SCda0opXiJlIAcK3mvGp1rk1tL+UIcoMp24l/vFNaLt7c4rtk3DnUuXYkNvQhlqU=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBERJF6hpg0o/wWUU74hKJoOukhqIPlxM2/M9KH90sag6guuJEiarflzNpQPOmQTqrLS1o3XIBk5k8S9nLie2DGo=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIISrwAy4Au1uAR21ys53ni2nEEXzli0s9x5IctlHCqcU", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_distribution": "Fedora", "ansible_distribution_release": "", "ansible_distribution_version": "39", "ansible_distribution_major_version": "39", "ansible_distribution_file_path": "/etc/redhat-release", "ansible_distribution_file_variety": "RedHat", "ansible_distribution_file_parsed": 
true, "ansible_os_family": "RedHat", "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 6, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 6, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_dns": {"nameservers": ["127.0.0.53"], "options": {"edns0": true, "trust-ad": true}, "search": ["us-east-1.aws.redhat.com"]}, "ansible_local": {}, "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_env": {"SHELL": "/bin/bash", "EDITOR": "/usr/bin/nano", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.9.8 37018 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.9.8 37018 22", "DEBUGINFOD_URLS": "https://debuginfod.fedoraproject.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_is_chroot": false, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_fibre_channel_wwn": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3717, "ansible_memfree_mb": 2835, "ansible_swaptotal_mb": 3716, "ansible_swapfree_mb": 3716, "ansible_memory_mb": {"real": {"total": 3717, "used": 882, "free": 2835}, "nocache": {"free": 3466, "used": 251}, "swap": {"total": 3716, "free": 3716, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_uuid": "ec2892d4-fd9a-460f-25d7-59cc0f5d8af1", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["97924df9-0e6a-4a28-b439-92c447b04700"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "97924df9-0e6a-4a28-b439-92c447b04700", "holders": []}, "xvda1": {"links": {"ids": [], 
"uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}, "zram0": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "4096", "partitions": {}, "rotational": "0", "scheduler_mode": "", "sectors": "7612416", "sectorsize": "4096", "size": "3.63 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["97924df9-0e6a-4a28-b439-92c447b04700"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 787, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "ext4", "options": "rw,seclabel,relatime", "dump": 0, "passno": 0, "size_total": 264108294144, "size_available": 251146764288, "block_size": 4096, "block_total": 64479564, "block_available": 61315128, "block_used": 3164436, "inode_total": 16384000, "inode_available": 16302249, "inode_used": 81751, "uuid": "97924df9-0e6a-4a28-b439-92c447b04700"}], "ansible_pkg_mgr": "dnf", "ansible_iscsi_iqn": "", "ansible_lsb": {}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.10.10-100.fc39.x86_64", "root": "UUID=97924df9-0e6a-4a28-b439-92c447b04700", "ro": true, "crashkernel": "auto", "net.ifnames": "0", "rhgb": true, "console": ["tty0", "ttyS0,115200n8"]}, "ansible_loadavg": {"1m": 0.84912109375, "5m": 0.6767578125, "15m": 0.41943359375}, "ansible_hostnqn": "", "ansible_interfaces": ["lo", "eth0"], "ansible_eth0": {"device": "eth0", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::4a44:1e77:128f:34e8", "prefix": "64", "scope": "link"}]}, "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}]}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:02:03:51:a3:4b", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::4a44:1e77:128f:34e8"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::4a44:1e77:128f:34e8"]}, "ansible_service_mgr": "systemd", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204285.32224: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204285.32234: _low_level_execute_command(): starting 22736 1727204285.32240: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204283.46661-25164-121215500273571/ > /dev/null 2>&1 && sleep 0' 22736 1727204285.32740: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204285.32744: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204285.32747: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204285.32750: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204285.32752: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.32811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.32817: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204285.32820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 
1727204285.32855: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.34850: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.34903: stderr chunk (state=3): >>><<< 22736 1727204285.34908: stdout chunk (state=3): >>><<< 22736 1727204285.34930: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204285.34938: handler run complete 22736 1727204285.35045: variable 'ansible_facts' from source: unknown 22736 1727204285.35125: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.35490: variable 'ansible_facts' from source: unknown 22736 1727204285.35559: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.35664: attempt loop complete, returning result 22736 1727204285.35668: _execute() done 22736 1727204285.35670: dumping result to json 22736 1727204285.35698: done dumping result, returning 22736 1727204285.35705: done running TaskExecutor() for managed-node2/TASK: Gathering Facts [12b410aa-8751-4f4a-548a-00000000056d] 22736 1727204285.35711: sending task result for task 12b410aa-8751-4f4a-548a-00000000056d 22736 1727204285.35978: done sending task result for task 12b410aa-8751-4f4a-548a-00000000056d 22736 1727204285.35981: WORKER PROCESS EXITING ok: [managed-node2] 22736 1727204285.36261: no more pending results, returning what we have 22736 1727204285.36264: results queue empty 22736 1727204285.36265: checking for any_errors_fatal 22736 1727204285.36266: done checking for any_errors_fatal 22736 1727204285.36267: checking for max_fail_percentage 22736 1727204285.36268: done checking for max_fail_percentage 22736 1727204285.36269: checking to see if all hosts have failed and the running result is not ok 22736 1727204285.36269: done checking to see if all hosts have failed 22736 1727204285.36270: getting the remaining hosts for this loop 22736 1727204285.36271: done getting the remaining hosts for this loop 22736 1727204285.36274: getting the next task for host managed-node2 22736 1727204285.36278: done getting next task for host managed-node2 22736 1727204285.36279: ^ task is: TASK: meta (flush_handlers) 22736 1727204285.36281: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, 
pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204285.36284: getting variables 22736 1727204285.36285: in VariableManager get_vars() 22736 1727204285.36309: Calling all_inventory to load vars for managed-node2 22736 1727204285.36312: Calling groups_inventory to load vars for managed-node2 22736 1727204285.36314: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.36326: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.36328: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.36330: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.37566: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.39153: done with get_vars() 22736 1727204285.39176: done getting variables 22736 1727204285.39240: in VariableManager get_vars() 22736 1727204285.39248: Calling all_inventory to load vars for managed-node2 22736 1727204285.39250: Calling groups_inventory to load vars for managed-node2 22736 1727204285.39252: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.39256: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.39258: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.39260: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.40416: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.42002: done with get_vars() 22736 1727204285.42033: done queuing things up, now waiting for results queue to drain 22736 1727204285.42035: results queue empty 22736 1727204285.42036: checking for any_errors_fatal 22736 1727204285.42040: done checking for any_errors_fatal 22736 1727204285.42041: checking for max_fail_percentage 22736 1727204285.42041: done checking for max_fail_percentage 22736 1727204285.42047: checking to see if all hosts have failed and the running result is not ok 22736 1727204285.42048: done checking to see if all hosts have failed 22736 1727204285.42048: getting the remaining hosts for this loop 22736 1727204285.42049: done getting the remaining hosts for this loop 22736 1727204285.42052: getting the next task for host managed-node2 22736 1727204285.42055: done getting next task for host managed-node2 22736 1727204285.42057: ^ task is: TASK: Verify network state restored to default 22736 1727204285.42058: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204285.42060: getting variables 22736 1727204285.42061: in VariableManager get_vars() 22736 1727204285.42068: Calling all_inventory to load vars for managed-node2 22736 1727204285.42070: Calling groups_inventory to load vars for managed-node2 22736 1727204285.42072: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.42077: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.42079: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.42081: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.43175: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.44802: done with get_vars() 22736 1727204285.44825: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:80 Tuesday 24 September 2024 14:58:05 -0400 (0:00:02.057) 0:00:50.233 ***** 22736 1727204285.44896: entering _queue_task() for managed-node2/include_tasks 22736 1727204285.45178: worker is 1 (out of 1 available) 22736 1727204285.45195: exiting _queue_task() for managed-node2/include_tasks 22736 1727204285.45209: done queuing things up, now waiting for results queue to drain 22736 1727204285.45210: waiting for pending results... 22736 1727204285.45412: running TaskExecutor() for managed-node2/TASK: Verify network state restored to default 22736 1727204285.45493: in run() - task 12b410aa-8751-4f4a-548a-000000000078 22736 1727204285.45508: variable 'ansible_search_path' from source: unknown 22736 1727204285.45640: calling self._execute() 22736 1727204285.45644: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204285.45649: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204285.45653: variable 'omit' from source: magic vars 22736 1727204285.45969: variable 'ansible_distribution_major_version' from source: facts 22736 1727204285.45982: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204285.45990: _execute() done 22736 1727204285.45994: dumping result to json 22736 1727204285.45996: done dumping result, returning 22736 1727204285.46004: done running TaskExecutor() for managed-node2/TASK: Verify network state restored to default [12b410aa-8751-4f4a-548a-000000000078] 22736 1727204285.46010: sending task result for task 12b410aa-8751-4f4a-548a-000000000078 22736 1727204285.46109: done sending task result for task 12b410aa-8751-4f4a-548a-000000000078 22736 1727204285.46112: WORKER PROCESS EXITING 22736 1727204285.46152: no more pending results, returning what we have 22736 1727204285.46157: in VariableManager get_vars() 22736 1727204285.46196: Calling all_inventory to load vars for managed-node2 22736 1727204285.46200: Calling groups_inventory to load vars for managed-node2 22736 1727204285.46204: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.46230: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.46235: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.46239: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.47508: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.49700: done with get_vars() 22736 1727204285.49743: 
variable 'ansible_search_path' from source: unknown 22736 1727204285.49764: we have included files to process 22736 1727204285.49765: generating all_blocks data 22736 1727204285.49767: done generating all_blocks data 22736 1727204285.49769: processing included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22736 1727204285.49770: loading included file: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22736 1727204285.49773: Loading data from /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 22736 1727204285.50321: done processing included file 22736 1727204285.50324: iterating over new_blocks loaded from include file 22736 1727204285.50326: in VariableManager get_vars() 22736 1727204285.50342: done with get_vars() 22736 1727204285.50344: filtering new block on tags 22736 1727204285.50369: done filtering new block on tags 22736 1727204285.50372: done iterating over new_blocks loaded from include file included: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed-node2 22736 1727204285.50378: extending task lists for all hosts with included blocks 22736 1727204285.50426: done extending task lists 22736 1727204285.50427: done processing included files 22736 1727204285.50428: results queue empty 22736 1727204285.50429: checking for any_errors_fatal 22736 1727204285.50431: done checking for any_errors_fatal 22736 1727204285.50432: checking for max_fail_percentage 22736 1727204285.50433: done checking for max_fail_percentage 22736 1727204285.50435: checking to see if all hosts have failed and the running result is not ok 22736 1727204285.50435: done checking to see if all hosts have failed 22736 1727204285.50436: getting the remaining hosts for this loop 22736 1727204285.50438: done getting the remaining hosts for this loop 22736 1727204285.50442: getting the next task for host managed-node2 22736 1727204285.50446: done getting next task for host managed-node2 22736 1727204285.50449: ^ task is: TASK: Check routes and DNS 22736 1727204285.50453: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204285.50455: getting variables 22736 1727204285.50456: in VariableManager get_vars() 22736 1727204285.50467: Calling all_inventory to load vars for managed-node2 22736 1727204285.50471: Calling groups_inventory to load vars for managed-node2 22736 1727204285.50474: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.50481: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.50485: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.50491: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.52632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.54446: done with get_vars() 22736 1727204285.54475: done getting variables 22736 1727204285.54520: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Tuesday 24 September 2024 14:58:05 -0400 (0:00:00.096) 0:00:50.330 ***** 22736 1727204285.54550: entering _queue_task() for managed-node2/shell 22736 1727204285.54835: worker is 1 (out of 1 available) 22736 1727204285.54851: exiting _queue_task() for managed-node2/shell 22736 1727204285.54863: done queuing things up, now waiting for results queue to drain 22736 1727204285.54865: waiting for pending results... 22736 1727204285.55059: running TaskExecutor() for managed-node2/TASK: Check routes and DNS 22736 1727204285.55151: in run() - task 12b410aa-8751-4f4a-548a-00000000057e 22736 1727204285.55166: variable 'ansible_search_path' from source: unknown 22736 1727204285.55169: variable 'ansible_search_path' from source: unknown 22736 1727204285.55210: calling self._execute() 22736 1727204285.55283: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204285.55293: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204285.55305: variable 'omit' from source: magic vars 22736 1727204285.55630: variable 'ansible_distribution_major_version' from source: facts 22736 1727204285.55646: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204285.55651: variable 'omit' from source: magic vars 22736 1727204285.55685: variable 'omit' from source: magic vars 22736 1727204285.55717: variable 'omit' from source: magic vars 22736 1727204285.55755: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 22736 1727204285.55796: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 22736 1727204285.55815: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 22736 1727204285.55836: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204285.55848: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 22736 1727204285.55882: variable 'inventory_hostname' from source: host vars for 'managed-node2' 
22736 1727204285.55887: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204285.55891: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204285.55985: Set connection var ansible_timeout to 10 22736 1727204285.55996: Set connection var ansible_module_compression to ZIP_DEFLATED 22736 1727204285.56005: Set connection var ansible_shell_executable to /bin/sh 22736 1727204285.56008: Set connection var ansible_shell_type to sh 22736 1727204285.56014: Set connection var ansible_pipelining to False 22736 1727204285.56017: Set connection var ansible_connection to ssh 22736 1727204285.56040: variable 'ansible_shell_executable' from source: unknown 22736 1727204285.56043: variable 'ansible_connection' from source: unknown 22736 1727204285.56046: variable 'ansible_module_compression' from source: unknown 22736 1727204285.56049: variable 'ansible_shell_type' from source: unknown 22736 1727204285.56053: variable 'ansible_shell_executable' from source: unknown 22736 1727204285.56056: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204285.56061: variable 'ansible_pipelining' from source: unknown 22736 1727204285.56064: variable 'ansible_timeout' from source: unknown 22736 1727204285.56070: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204285.56201: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204285.56210: variable 'omit' from source: magic vars 22736 1727204285.56216: starting attempt loop 22736 1727204285.56219: running the handler 22736 1727204285.56231: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=False) 22736 1727204285.56250: _low_level_execute_command(): starting 22736 1727204285.56257: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 22736 1727204285.56844: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.56848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.56851: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.56853: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.56916: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master 
<<< 22736 1727204285.56920: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 22736 1727204285.56924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.56969: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.58746: stdout chunk (state=3): >>>/root <<< 22736 1727204285.58854: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.58921: stderr chunk (state=3): >>><<< 22736 1727204285.58925: stdout chunk (state=3): >>><<< 22736 1727204285.58949: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204285.58962: _low_level_execute_command(): starting 22736 1727204285.58970: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401 `" && echo ansible-tmp-1727204285.5894916-25213-199331437387401="` echo /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401 `" ) && sleep 0' 22736 1727204285.59460: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.59464: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.59476: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.59479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.59536: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.59540: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
22736 1727204285.59545: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.59588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.61643: stdout chunk (state=3): >>>ansible-tmp-1727204285.5894916-25213-199331437387401=/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401 <<< 22736 1727204285.61759: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.61812: stderr chunk (state=3): >>><<< 22736 1727204285.61815: stdout chunk (state=3): >>><<< 22736 1727204285.61836: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1727204285.5894916-25213-199331437387401=/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401 , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204285.61871: variable 'ansible_module_compression' from source: unknown 22736 1727204285.61918: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-22736tjjcxiaw/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 22736 1727204285.61960: variable 'ansible_facts' from source: unknown 22736 1727204285.62033: transferring module to remote /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py 22736 1727204285.62152: Sending initial data 22736 1727204285.62156: Sent initial data (156 bytes) 22736 1727204285.62640: stderr chunk (state=3): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204285.62644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204285.62647: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204285.62649: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 22736 
1727204285.62652: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.62704: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.62708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.62754: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.64476: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 22736 1727204285.64512: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 22736 1727204285.64548: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpz89gd04s /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py <<< 22736 1727204285.64551: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py" <<< 22736 1727204285.64583: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: do_upload: upload local "/root/.ansible/tmp/ansible-local-22736tjjcxiaw/tmpz89gd04s" to remote "/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py" <<< 22736 1727204285.64592: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py" <<< 22736 1727204285.65352: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.65432: stderr chunk (state=3): >>><<< 22736 1727204285.65436: stdout chunk (state=3): >>><<< 22736 1727204285.65457: done transferring module to remote 22736 1727204285.65475: _low_level_execute_command(): starting 22736 1727204285.65478: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/ /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py && sleep 0' 22736 1727204285.65964: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.65968: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 22736 1727204285.65970: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 22736 1727204285.65973: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 22736 1727204285.65979: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.66040: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.66042: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.66080: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.68059: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.68124: stderr chunk (state=3): >>><<< 22736 1727204285.68128: stdout chunk (state=3): >>><<< 22736 1727204285.68141: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204285.68144: _low_level_execute_command(): starting 22736 1727204285.68151: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/AnsiballZ_command.py && sleep 0' 22736 1727204285.68645: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.68649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.68652: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 22736 1727204285.68654: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.68710: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.68714: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.68772: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.87457: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3217sec preferred_lft 3217sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:05.864433", "end": "2024-09-24 14:58:05.873401", "delta": "0:00:00.008968", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 22736 1727204285.89128: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 22736 1727204285.89191: stderr chunk (state=3): >>><<< 22736 1727204285.89195: stdout chunk (state=3): >>><<< 22736 1727204285.89216: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 3217sec preferred_lft 3217sec\n inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8).\n# Do not edit.\n#\n# This file might be symlinked as /etc/resolv.conf. If you're looking at\n# /etc/resolv.conf and seeing this text, you have followed the symlink.\n#\n# This is a dynamic resolv.conf file for connecting local clients to the\n# internal DNS stub resolver of systemd-resolved. This file lists all\n# configured search domains.\n#\n# Run \"resolvectl status\" to see details about the uplink DNS servers\n# currently in use.\n#\n# Third party programs should typically not access this file directly, but only\n# through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a\n# different way, replace this symlink by a static file or a different symlink.\n#\n# See man:systemd-resolved.service(8) for details about the supported modes of\n# operation for /etc/resolv.conf.\n\nnameserver 127.0.0.53\noptions edns0 trust-ad\nsearch us-east-1.aws.redhat.com", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-24 14:58:05.864433", "end": "2024-09-24 14:58:05.873401", "delta": "0:00:00.008968", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 22736 1727204285.89271: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 22736 1727204285.89279: _low_level_execute_command(): starting 22736 1727204285.89285: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1727204285.5894916-25213-199331437387401/ > /dev/null 2>&1 && sleep 0' 22736 1727204285.89782: stderr chunk (state=2): >>>OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 22736 1727204285.89787: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.89790: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 22736 1727204285.89792: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 22736 1727204285.89845: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master <<< 22736 1727204285.89848: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 22736 1727204285.89897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 22736 1727204285.91832: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 22736 1727204285.91879: stderr chunk (state=3): >>><<< 22736 1727204285.91882: stdout chunk (state=3): >>><<< 22736 1727204285.91900: 
_low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.3p1, OpenSSL 3.1.4 24 Oct 2023 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 22736 1727204285.91907: handler run complete 22736 1727204285.91931: Evaluated conditional (False): False 22736 1727204285.91946: attempt loop complete, returning result 22736 1727204285.91950: _execute() done 22736 1727204285.91952: dumping result to json 22736 1727204285.91961: done dumping result, returning 22736 1727204285.91970: done running TaskExecutor() for managed-node2/TASK: Check routes and DNS [12b410aa-8751-4f4a-548a-00000000057e] 22736 1727204285.91975: sending task result for task 12b410aa-8751-4f4a-548a-00000000057e 22736 1727204285.92099: done sending task result for task 12b410aa-8751-4f4a-548a-00000000057e 22736 1727204285.92102: WORKER PROCESS EXITING ok: [managed-node2] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008968", "end": "2024-09-24 14:58:05.873401", "rc": 0, "start": "2024-09-24 14:58:05.864433" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:02:03:51:a3:4b brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 3217sec preferred_lft 3217sec inet6 fe80::4a44:1e77:128f:34e8/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # This is /run/systemd/resolve/stub-resolv.conf managed by man:systemd-resolved(8). # Do not edit. # # This file might be symlinked as /etc/resolv.conf. If you're looking at # /etc/resolv.conf and seeing this text, you have followed the symlink. # # This is a dynamic resolv.conf file for connecting local clients to the # internal DNS stub resolver of systemd-resolved. This file lists all # configured search domains. 
# # Run "resolvectl status" to see details about the uplink DNS servers # currently in use. # # Third party programs should typically not access this file directly, but only # through the symlink at /etc/resolv.conf. To manage man:resolv.conf(5) in a # different way, replace this symlink by a static file or a different symlink. # # See man:systemd-resolved.service(8) for details about the supported modes of # operation for /etc/resolv.conf. nameserver 127.0.0.53 options edns0 trust-ad search us-east-1.aws.redhat.com 22736 1727204285.92200: no more pending results, returning what we have 22736 1727204285.92205: results queue empty 22736 1727204285.92206: checking for any_errors_fatal 22736 1727204285.92208: done checking for any_errors_fatal 22736 1727204285.92208: checking for max_fail_percentage 22736 1727204285.92210: done checking for max_fail_percentage 22736 1727204285.92220: checking to see if all hosts have failed and the running result is not ok 22736 1727204285.92221: done checking to see if all hosts have failed 22736 1727204285.92222: getting the remaining hosts for this loop 22736 1727204285.92224: done getting the remaining hosts for this loop 22736 1727204285.92229: getting the next task for host managed-node2 22736 1727204285.92236: done getting next task for host managed-node2 22736 1727204285.92240: ^ task is: TASK: Verify DNS and network connectivity 22736 1727204285.92243: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204285.92252: getting variables 22736 1727204285.92254: in VariableManager get_vars() 22736 1727204285.92282: Calling all_inventory to load vars for managed-node2 22736 1727204285.92286: Calling groups_inventory to load vars for managed-node2 22736 1727204285.92291: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.92302: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.92305: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.92308: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.93583: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204285.95276: done with get_vars() 22736 1727204285.95301: done getting variables 22736 1727204285.95353: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/action) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Tuesday 24 September 2024 14:58:05 -0400 (0:00:00.408) 0:00:50.738 ***** 22736 1727204285.95380: entering _queue_task() for managed-node2/shell 22736 1727204285.95648: worker is 1 (out of 1 available) 22736 1727204285.95663: exiting _queue_task() for managed-node2/shell 22736 1727204285.95676: done queuing things up, now waiting for results queue to drain 22736 1727204285.95677: waiting for pending results... 22736 1727204285.95870: running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity 22736 1727204285.95944: in run() - task 12b410aa-8751-4f4a-548a-00000000057f 22736 1727204285.95960: variable 'ansible_search_path' from source: unknown 22736 1727204285.95964: variable 'ansible_search_path' from source: unknown 22736 1727204285.95998: calling self._execute() 22736 1727204285.96081: variable 'ansible_host' from source: host vars for 'managed-node2' 22736 1727204285.96088: variable 'ansible_ssh_extra_args' from source: host vars for 'managed-node2' 22736 1727204285.96101: variable 'omit' from source: magic vars 22736 1727204285.96417: variable 'ansible_distribution_major_version' from source: facts 22736 1727204285.96428: Evaluated conditional (ansible_distribution_major_version != '6'): True 22736 1727204285.96547: variable 'ansible_facts' from source: unknown 22736 1727204285.97247: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): False 22736 1727204285.97251: when evaluation is False, skipping this task 22736 1727204285.97254: _execute() done 22736 1727204285.97257: dumping result to json 22736 1727204285.97260: done dumping result, returning 22736 1727204285.97267: done running TaskExecutor() for managed-node2/TASK: Verify DNS and network connectivity [12b410aa-8751-4f4a-548a-00000000057f] 22736 1727204285.97272: sending task result for task 12b410aa-8751-4f4a-548a-00000000057f 22736 1727204285.97366: done sending task result for task 12b410aa-8751-4f4a-548a-00000000057f 22736 1727204285.97369: WORKER PROCESS EXITING skipping: [managed-node2] => { "changed": false, "false_condition": "ansible_facts[\"distribution\"] == \"CentOS\"", "skip_reason": "Conditional result was False" } 22736 1727204285.97425: no more 
pending results, returning what we have 22736 1727204285.97429: results queue empty 22736 1727204285.97431: checking for any_errors_fatal 22736 1727204285.97442: done checking for any_errors_fatal 22736 1727204285.97442: checking for max_fail_percentage 22736 1727204285.97444: done checking for max_fail_percentage 22736 1727204285.97445: checking to see if all hosts have failed and the running result is not ok 22736 1727204285.97446: done checking to see if all hosts have failed 22736 1727204285.97447: getting the remaining hosts for this loop 22736 1727204285.97449: done getting the remaining hosts for this loop 22736 1727204285.97453: getting the next task for host managed-node2 22736 1727204285.97463: done getting next task for host managed-node2 22736 1727204285.97466: ^ task is: TASK: meta (flush_handlers) 22736 1727204285.97468: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204285.97472: getting variables 22736 1727204285.97474: in VariableManager get_vars() 22736 1727204285.97502: Calling all_inventory to load vars for managed-node2 22736 1727204285.97506: Calling groups_inventory to load vars for managed-node2 22736 1727204285.97509: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204285.97523: Calling all_plugins_play to load vars for managed-node2 22736 1727204285.97527: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204285.97530: Calling groups_plugins_play to load vars for managed-node2 22736 1727204285.98747: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204286.00331: done with get_vars() 22736 1727204286.00354: done getting variables 22736 1727204286.00413: in VariableManager get_vars() 22736 1727204286.00424: Calling all_inventory to load vars for managed-node2 22736 1727204286.00426: Calling groups_inventory to load vars for managed-node2 22736 1727204286.00428: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204286.00433: Calling all_plugins_play to load vars for managed-node2 22736 1727204286.00435: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204286.00437: Calling groups_plugins_play to load vars for managed-node2 22736 1727204286.01621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204286.03195: done with get_vars() 22736 1727204286.03227: done queuing things up, now waiting for results queue to drain 22736 1727204286.03229: results queue empty 22736 1727204286.03230: checking for any_errors_fatal 22736 1727204286.03232: done checking for any_errors_fatal 22736 1727204286.03233: checking for max_fail_percentage 22736 1727204286.03234: done checking for max_fail_percentage 22736 1727204286.03234: checking to see if all hosts have failed and the running result is not ok 22736 1727204286.03235: done checking to see if all hosts have failed 22736 1727204286.03236: getting the remaining hosts for this loop 22736 1727204286.03237: done getting the remaining hosts for this loop 22736 1727204286.03240: getting the next task for host managed-node2 22736 1727204286.03243: done getting next task for host managed-node2 22736 1727204286.03244: ^ task is: TASK: meta (flush_handlers) 22736 
1727204286.03246: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 22736 1727204286.03248: getting variables 22736 1727204286.03248: in VariableManager get_vars() 22736 1727204286.03255: Calling all_inventory to load vars for managed-node2 22736 1727204286.03257: Calling groups_inventory to load vars for managed-node2 22736 1727204286.03259: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204286.03265: Calling all_plugins_play to load vars for managed-node2 22736 1727204286.03267: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204286.03269: Calling groups_plugins_play to load vars for managed-node2 22736 1727204286.04364: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204286.05971: done with get_vars() 22736 1727204286.05994: done getting variables 22736 1727204286.06040: in VariableManager get_vars() 22736 1727204286.06049: Calling all_inventory to load vars for managed-node2 22736 1727204286.06051: Calling groups_inventory to load vars for managed-node2 22736 1727204286.06053: Calling all_plugins_inventory to load vars for managed-node2 22736 1727204286.06057: Calling all_plugins_play to load vars for managed-node2 22736 1727204286.06059: Calling groups_plugins_inventory to load vars for managed-node2 22736 1727204286.06061: Calling groups_plugins_play to load vars for managed-node2 22736 1727204286.07136: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 22736 1727204286.08703: done with get_vars() 22736 1727204286.08731: done queuing things up, now waiting for results queue to drain 22736 1727204286.08733: results queue empty 22736 1727204286.08734: checking for any_errors_fatal 22736 1727204286.08736: done checking for any_errors_fatal 22736 1727204286.08737: checking for max_fail_percentage 22736 1727204286.08738: done checking for max_fail_percentage 22736 1727204286.08738: checking to see if all hosts have failed and the running result is not ok 22736 1727204286.08739: done checking to see if all hosts have failed 22736 1727204286.08739: getting the remaining hosts for this loop 22736 1727204286.08740: done getting the remaining hosts for this loop 22736 1727204286.08749: getting the next task for host managed-node2 22736 1727204286.08753: done getting next task for host managed-node2 22736 1727204286.08753: ^ task is: None 22736 1727204286.08755: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 22736 1727204286.08756: done queuing things up, now waiting for results queue to drain 22736 1727204286.08756: results queue empty 22736 1727204286.08757: checking for any_errors_fatal 22736 1727204286.08757: done checking for any_errors_fatal 22736 1727204286.08758: checking for max_fail_percentage 22736 1727204286.08759: done checking for max_fail_percentage 22736 1727204286.08759: checking to see if all hosts have failed and the running result is not ok 22736 1727204286.08760: done checking to see if all hosts have failed 22736 1727204286.08761: getting the next task for host managed-node2 22736 1727204286.08763: done getting next task for host managed-node2 22736 1727204286.08763: ^ task is: None 22736 1727204286.08764: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False PLAY RECAP ********************************************************************* managed-node2 : ok=82 changed=3 unreachable=0 failed=0 skipped=74 rescued=0 ignored=1 Tuesday 24 September 2024 14:58:06 -0400 (0:00:00.134) 0:00:50.873 ***** =============================================================================== fedora.linux_system_roles.network : Check which services are running ---- 2.43s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.37s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 fedora.linux_system_roles.network : Check which services are running ---- 2.34s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Gathering Facts --------------------------------------------------------- 2.18s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tests_ethernet_nm.yml:6 Gathering Facts --------------------------------------------------------- 2.06s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:77 Install iproute --------------------------------------------------------- 2.06s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:16 fedora.linux_system_roles.network : Check which packages are installed --- 1.64s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Check which packages are installed --- 1.21s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Gathering Facts --------------------------------------------------------- 1.19s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:3 Create veth interface lsr27 --------------------------------------------- 1.18s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/manage_test_interface.yml:27 Gathering Facts --------------------------------------------------------- 1.18s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:13 Gathering Facts --------------------------------------------------------- 1.14s 
/tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile.yml:3 fedora.linux_system_roles.network : Enable and start NetworkManager ----- 1.12s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Gathering Facts --------------------------------------------------------- 1.11s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_ethernet.yml:50 fedora.linux_system_roles.network : Check which packages are installed --- 1.10s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 fedora.linux_system_roles.network : Re-test connectivity ---------------- 1.08s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Gather the minimum subset of ansible_facts required by the network role test --- 1.08s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3 Gathering Facts --------------------------------------------------------- 1.07s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/down_profile+delete_interface.yml:5 Gathering Facts --------------------------------------------------------- 1.05s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/remove_profile.yml:3 fedora.linux_system_roles.network : Configure networking connection profiles --- 1.03s /tmp/collections-twx/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 22736 1727204286.08861: RUNNING CLEANUP