[DEPRECATION WARNING]: ANSIBLE_COLLECTIONS_PATHS option, does not fit var naming standard, use the singular form ANSIBLE_COLLECTIONS_PATH instead. This feature will be removed from ansible-core in version 2.19. Deprecation warnings can be disabled by setting deprecation_warnings=False in ansible.cfg.
11579 1726882170.95704: starting run
ansible-playbook [core 2.17.4]
  config file = None
  configured module search path = ['/root/.ansible/plugins/modules', '/usr/share/ansible/plugins/modules']
  ansible python module location = /usr/local/lib/python3.12/site-packages/ansible
  ansible collection location = /tmp/collections-spT
  executable location = /usr/local/bin/ansible-playbook
  python version = 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] (/usr/bin/python3.12)
  jinja version = 3.1.4
  libyaml = True
No config file found; using defaults
11579 1726882170.95989: Added group all to inventory
11579 1726882170.95991: Added group ungrouped to inventory
11579 1726882170.95998: Group all now contains ungrouped
11579 1726882170.96000: Examining possible inventory source: /tmp/network-Kc3/inventory.yml
11579 1726882171.08397: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/cache
11579 1726882171.08456: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py
11579 1726882171.08474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory
11579 1726882171.08536: Loading InventoryModule 'host_list' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py
11579 1726882171.08606: Loaded config def from plugin (inventory/script)
11579 1726882171.08609: Loading InventoryModule 'script' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py
11579 1726882171.08648: Loading InventoryModule 'auto' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py
11579 1726882171.08734: Loaded config def from plugin (inventory/yaml)
11579 1726882171.08737: Loading InventoryModule 'yaml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py
11579 1726882171.08824: Loading InventoryModule 'ini' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/ini.py
11579 1726882171.09245: Loading InventoryModule 'toml' from /usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/toml.py
11579 1726882171.09248: Attempting to use plugin host_list (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/host_list.py)
11579 1726882171.09251: Attempting to use plugin script (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/script.py)
11579 1726882171.09257: Attempting to use plugin auto (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/auto.py)
11579 1726882171.09262: Loading data from /tmp/network-Kc3/inventory.yml
11579 1726882171.09331: /tmp/network-Kc3/inventory.yml was not parsable by auto
11579 1726882171.09395: Attempting to use plugin yaml (/usr/local/lib/python3.12/site-packages/ansible/plugins/inventory/yaml.py)
11579 1726882171.09433: Loading data from /tmp/network-Kc3/inventory.yml
11579 1726882171.09514: group all already in inventory
11579 1726882171.09521: set inventory_file for managed_node1
11579 1726882171.09525: set inventory_dir for managed_node1
11579 1726882171.09526: Added host managed_node1 to inventory
11579 1726882171.09528: Added host managed_node1 to group all
11579 1726882171.09529: set ansible_host for managed_node1
11579 1726882171.09530: set ansible_ssh_extra_args for managed_node1
11579 1726882171.09532: set inventory_file for managed_node2
11579 1726882171.09535: set inventory_dir for managed_node2
11579 1726882171.09536: Added host managed_node2 to inventory
11579 1726882171.09537: Added host managed_node2 to group all
11579 1726882171.09538: set ansible_host for managed_node2
11579 1726882171.09538: set ansible_ssh_extra_args for managed_node2
11579 1726882171.09541: set inventory_file for managed_node3
11579 1726882171.09543: set inventory_dir for managed_node3
11579 1726882171.09543: Added host managed_node3 to inventory
11579 1726882171.09544: Added host managed_node3 to group all
11579 1726882171.09545: set ansible_host for managed_node3
11579 1726882171.09546: set ansible_ssh_extra_args for managed_node3
11579 1726882171.09548: Reconcile groups and hosts in inventory.
11579 1726882171.09551: Group ungrouped now contains managed_node1
11579 1726882171.09553: Group ungrouped now contains managed_node2
11579 1726882171.09555: Group ungrouped now contains managed_node3
11579 1726882171.09630: '/usr/local/lib/python3.12/site-packages/ansible/plugins/vars/__init__' skipped due to reserved name
11579 1726882171.09753: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments
11579 1726882171.09802: Loading ModuleDocFragment 'vars_plugin_staging' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/vars_plugin_staging.py
11579 1726882171.09829: Loaded config def from plugin (vars/host_group_vars)
11579 1726882171.09831: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=False, class_only=True)
11579 1726882171.09838: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/vars
11579 1726882171.09846: Loading VarsModule 'host_group_vars' from /usr/local/lib/python3.12/site-packages/ansible/plugins/vars/host_group_vars.py (found_in_cache=True, class_only=False)
11579 1726882171.09888: Loading CacheModule 'memory' from /usr/local/lib/python3.12/site-packages/ansible/plugins/cache/memory.py (found_in_cache=True, class_only=False)
11579 1726882171.10158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882171.10227: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py
11579 1726882171.10252: Loaded config def from plugin (connection/local)
11579 1726882171.10254: Loading Connection 'local' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/local.py (found_in_cache=False, class_only=True)
11579 1726882171.10630: Loaded config def from plugin (connection/paramiko_ssh)
11579 1726882171.10632: Loading Connection 'paramiko_ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/paramiko_ssh.py (found_in_cache=False, class_only=True)
11579 1726882171.11180: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11579 1726882171.11208: Loaded config def from plugin (connection/psrp)
11579 1726882171.11210: Loading Connection 'psrp' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/psrp.py (found_in_cache=False, class_only=True)
11579 1726882171.11752: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11579 1726882171.11788: Loaded config def from plugin (connection/ssh)
11579 1726882171.11791: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=False, class_only=True)
11579 1726882171.13596: Loading ModuleDocFragment 'connection_pipelining' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/connection_pipelining.py (found_in_cache=True, class_only=False)
11579 1726882171.13620: Loaded config def from plugin (connection/winrm)
11579 1726882171.13622: Loading Connection 'winrm' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/winrm.py (found_in_cache=False, class_only=True)
11579 1726882171.13643: '/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/__init__' skipped due to reserved name
11579 1726882171.13687: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py
11579 1726882171.13727: Loaded config def from plugin (shell/cmd)
11579 1726882171.13729: Loading ShellModule 'cmd' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/cmd.py (found_in_cache=False, class_only=True)
11579 1726882171.13745: Loading ModuleDocFragment 'shell_windows' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_windows.py (found_in_cache=True, class_only=False)
11579 1726882171.13780: Loaded config def from plugin (shell/powershell)
11579 1726882171.13781: Loading ShellModule 'powershell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/powershell.py (found_in_cache=False, class_only=True)
11579 1726882171.13821: Loading ModuleDocFragment 'shell_common' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/shell_common.py
11579 1726882171.13922: Loaded config def from plugin (shell/sh)
11579 1726882171.13923: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=False, class_only=True)
11579 1726882171.13944: '/usr/local/lib/python3.12/site-packages/ansible/plugins/become/__init__' skipped due to reserved name
11579 1726882171.14018: Loaded config def from plugin (become/runas)
11579 1726882171.14021: Loading BecomeModule 'runas' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/runas.py (found_in_cache=False, class_only=True)
11579 1726882171.14129: Loaded config def from plugin (become/su)
11579 1726882171.14131: Loading BecomeModule 'su' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/su.py (found_in_cache=False, class_only=True)
11579 1726882171.14224: Loaded config def from plugin (become/sudo)
11579 1726882171.14226: Loading BecomeModule 'sudo' from /usr/local/lib/python3.12/site-packages/ansible/plugins/become/sudo.py (found_in_cache=False, class_only=True)
running playbook inside collection fedora.linux_system_roles
11579 1726882171.14247: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11579 1726882171.14461: in VariableManager get_vars()
11579 1726882171.14475: done with get_vars()
11579 1726882171.14562: trying /usr/local/lib/python3.12/site-packages/ansible/modules
11579 1726882171.16922: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action
11579 1726882171.16992: in VariableManager get_vars()
11579 1726882171.16998: done with get_vars()
11579 1726882171.17000: variable 'playbook_dir' from source: magic vars
11579 1726882171.17001: variable 'ansible_playbook_python' from source: magic vars
11579 1726882171.17001: variable 'ansible_config_file' from source: magic vars
11579 1726882171.17002: variable 'groups' from source: magic vars
11579 1726882171.17002: variable 'omit' from source: magic vars
11579 1726882171.17002: variable 'ansible_version' from source: magic vars
11579 1726882171.17003: variable 'ansible_check_mode' from source: magic vars
11579 1726882171.17003: variable 'ansible_diff_mode' from source: magic vars
11579 1726882171.17004: variable 'ansible_forks' from source: magic vars
11579 1726882171.17004: variable 'ansible_inventory_sources' from source: magic vars
11579 1726882171.17005: variable 'ansible_skip_tags' from source: magic vars
11579 1726882171.17005: variable 'ansible_limit' from source: magic vars
11579 1726882171.17005: variable 'ansible_run_tags' from source: magic vars
11579 1726882171.17006: variable 'ansible_verbosity' from source: magic vars
11579 1726882171.17039: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml
11579 1726882171.17742: in VariableManager get_vars()
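The deprecation warning at the top of the run names its own remedies: switch to the singular setting, or suppress the warnings. A minimal `ansible.cfg` sketch based only on the warning text (the path value mirrors this run's collection location; adjust for your environment):

```ini
[defaults]
; Singular form replaces the deprecated ANSIBLE_COLLECTIONS_PATHS / collections_paths.
collections_path = /tmp/collections-spT
; Optional: silence deprecation warnings entirely, as the warning itself suggests.
deprecation_warnings = False
```

Note that this run reported "No config file found; using defaults", so any such file would need to be created (e.g. in the playbook directory or as `~/.ansible.cfg`) before it takes effect.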
11579 1726882171.17758: done with get_vars()
11579 1726882171.17767: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml
11579 1726882171.18686: in VariableManager get_vars()
11579 1726882171.18702: done with get_vars()
11579 1726882171.18711: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11579 1726882171.18811: in VariableManager get_vars()
11579 1726882171.18827: done with get_vars()
11579 1726882171.18982: in VariableManager get_vars()
11579 1726882171.18997: done with get_vars()
11579 1726882171.19005: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11579 1726882171.19072: in VariableManager get_vars()
11579 1726882171.19088: done with get_vars()
11579 1726882171.19365: in VariableManager get_vars()
11579 1726882171.19378: done with get_vars()
11579 1726882171.19382: variable 'omit' from source: magic vars
11579 1726882171.19401: variable 'omit' from source: magic vars
11579 1726882171.19433: in VariableManager get_vars()
11579 1726882171.19443: done with get_vars()
11579 1726882171.19485: in VariableManager get_vars()
11579 1726882171.19500: done with get_vars()
11579 1726882171.19533: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
11579 1726882171.19804: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
11579 1726882171.19924: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
11579 1726882171.20300: in VariableManager get_vars()
11579 1726882171.20313: done with get_vars()
11579 1726882171.20591: trying /usr/local/lib/python3.12/site-packages/ansible/modules/__pycache__
11579 1726882171.20677: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
11579 1726882171.21714: in VariableManager get_vars()
11579 1726882171.21729: done with get_vars()
11579 1726882171.21738: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml
11579 1726882171.21901: in VariableManager get_vars()
11579 1726882171.21919: done with get_vars()
11579 1726882171.22035: in VariableManager get_vars()
11579 1726882171.22051: done with get_vars()
11579 1726882171.22328: in VariableManager get_vars()
11579 1726882171.22345: done with get_vars()
11579 1726882171.22349: variable 'omit' from source: magic vars
11579 1726882171.22373: variable 'omit' from source: magic vars
11579 1726882171.22414: in VariableManager get_vars()
11579 1726882171.22428: done with get_vars()
11579 1726882171.22447: in VariableManager get_vars()
11579 1726882171.22462: done with get_vars()
11579 1726882171.22490: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/defaults/main.yml
11579 1726882171.22604: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/meta/main.yml
11579 1726882171.24131: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml
11579 1726882171.24508: in VariableManager get_vars()
11579 1726882171.24529: done with get_vars()
redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf
11579 1726882171.26526: in VariableManager get_vars()
11579 1726882171.26547: done with get_vars()
11579 1726882171.26556: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
statically imported: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml
11579 1726882171.27047: in VariableManager get_vars()
11579 1726882171.27068: done with get_vars()
11579 1726882171.27128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback
11579 1726882171.27143: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
redirecting (type: callback) ansible.builtin.debug to ansible.posix.debug
11579 1726882171.27376: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py
11579 1726882171.27545: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.debug)
11579 1726882171.27548: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.debug' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback)
11579 1726882171.27578: '/usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__init__' skipped due to reserved name
11579 1726882171.27608: Loading ModuleDocFragment 'default_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/default_callback.py (found_in_cache=True, class_only=False)
11579 1726882171.27784: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py
11579 1726882171.27851: Loaded config def from plugin (callback/default)
11579 1726882171.27854: Loading CallbackModule 'default' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/default.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11579 1726882171.28977: Loaded config def from plugin (callback/junit)
11579 1726882171.28979: Loading CallbackModule 'junit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/junit.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11579 1726882171.29015: Loading ModuleDocFragment 'result_format_callback' from /usr/local/lib/python3.12/site-packages/ansible/plugins/doc_fragments/result_format_callback.py (found_in_cache=True, class_only=False)
11579 1726882171.29052: Loaded config def from plugin (callback/minimal)
11579 1726882171.29054: Loading CallbackModule 'minimal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/minimal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11579 1726882171.29079: Loading CallbackModule 'oneline' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/oneline.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
11579 1726882171.29123: Loaded config def from plugin (callback/tree)
11579 1726882171.29125: Loading CallbackModule 'tree' from /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/tree.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
redirecting (type: callback) ansible.builtin.profile_tasks to ansible.posix.profile_tasks
11579 1726882171.29205: Loaded config def from plugin (callback/ansible_collections.ansible.posix.plugins.callback.profile_tasks)
11579 1726882171.29207: Loading CallbackModule 'ansible_collections.ansible.posix.plugins.callback.profile_tasks' from /tmp/collections-spT/ansible_collections/ansible/posix/plugins/callback/profile_tasks.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/callback/__pycache__:/usr/local/lib/python3.12/site-packages/ansible/plugins/callback) (found_in_cache=False, class_only=True)
Skipping callback 'default', as we already have a stdout callback.
Skipping callback 'minimal', as we already have a stdout callback.
Skipping callback 'oneline', as we already have a stdout callback.
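The inventory parse earlier in the run (YAML plugin, three hosts each getting `ansible_host` and `ansible_ssh_extra_args`, all ending up in `ungrouped`) implies an inventory file of roughly this shape. This is a hypothetical reconstruction: the host and variable names come from the log, but the addresses and SSH arguments are placeholders, not the real values of /tmp/network-Kc3/inventory.yml.

```yaml
# Hypothetical shape of the parsed inventory; only host/variable names are from the log.
all:
  hosts:
    managed_node1:
      ansible_host: 192.0.2.10                              # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
    managed_node2:
      ansible_host: 192.0.2.11                              # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
    managed_node3:
      ansible_host: 192.0.2.12                              # placeholder address
      ansible_ssh_extra_args: -o StrictHostKeyChecking=no   # placeholder args
```

Hosts defined directly under `all:` with no named child group are what the "Group ungrouped now contains managed_nodeN" records reflect.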
PLAYBOOK: tests_bond_nm.yml ****************************************************
2 plays in /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml
11579 1726882171.29226: in VariableManager get_vars()
11579 1726882171.29235: done with get_vars()
11579 1726882171.29239: in VariableManager get_vars()
11579 1726882171.29243: done with get_vars()
11579 1726882171.29246: variable 'omit' from source: magic vars
11579 1726882171.29268: in VariableManager get_vars()
11579 1726882171.29276: done with get_vars()
11579 1726882171.29290: variable 'omit' from source: magic vars

PLAY [Run playbook 'playbooks/tests_bond.yml' with nm as provider] *************
11579 1726882171.29658: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy
11579 1726882171.29710: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py
11579 1726882171.29736: getting the remaining hosts for this loop
11579 1726882171.29738: done getting the remaining hosts for this loop
11579 1726882171.29740: getting the next task for host managed_node1
11579 1726882171.29742: done getting next task for host managed_node1
11579 1726882171.29743: ^ task is: TASK: Gathering Facts
11579 1726882171.29745: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882171.29746: getting variables
11579 1726882171.29747: in VariableManager get_vars()
11579 1726882171.29753: Calling all_inventory to load vars for managed_node1
11579 1726882171.29754: Calling groups_inventory to load vars for managed_node1
11579 1726882171.29756: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882171.29765: Calling all_plugins_play to load vars for managed_node1
11579 1726882171.29772: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882171.29774: Calling groups_plugins_play to load vars for managed_node1
11579 1726882171.29798: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882171.29832: done with get_vars()
11579 1726882171.29837: done getting variables
11579 1726882171.29880: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Gathering Facts] *********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Friday 20 September 2024  21:29:31 -0400 (0:00:00.007)       0:00:00.007 ******
11579 1726882171.29897: entering _queue_task() for managed_node1/gather_facts
11579 1726882171.29898: Creating lock for gather_facts
11579 1726882171.30180: worker is 1 (out of 1 available)
11579 1726882171.30191: exiting _queue_task() for managed_node1/gather_facts
11579 1726882171.30207: done queuing things up, now waiting for results queue to drain
11579 1726882171.30209: waiting for pending results...
11579 1726882171.30511: running TaskExecutor() for managed_node1/TASK: Gathering Facts
11579 1726882171.30517: in run() - task 12673a56-9f93-f197-7423-0000000000cc
11579 1726882171.30521: variable 'ansible_search_path' from source: unknown
11579 1726882171.30524: calling self._execute()
11579 1726882171.30567: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882171.30580: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882171.30611: variable 'omit' from source: magic vars
11579 1726882171.30717: variable 'omit' from source: magic vars
11579 1726882171.30752: variable 'omit' from source: magic vars
11579 1726882171.30789: variable 'omit' from source: magic vars
11579 1726882171.30850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11579 1726882171.30891: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11579 1726882171.30927: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11579 1726882171.30949: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882171.30964: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882171.31002: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11579 1726882171.31010: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882171.31016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882171.31204: Set connection var ansible_timeout to 10
11579 1726882171.31208: Set connection var ansible_shell_type to sh
11579 1726882171.31210: Set connection var ansible_module_compression to ZIP_DEFLATED
11579 1726882171.31212: Set connection var ansible_shell_executable to /bin/sh
11579 1726882171.31215: Set connection var ansible_pipelining to False
11579 1726882171.31216: Set connection var ansible_connection to ssh
11579 1726882171.31225: variable 'ansible_shell_executable' from source: unknown
11579 1726882171.31227: variable 'ansible_connection' from source: unknown
11579 1726882171.31229: variable 'ansible_module_compression' from source: unknown
11579 1726882171.31231: variable 'ansible_shell_type' from source: unknown
11579 1726882171.31233: variable 'ansible_shell_executable' from source: unknown
11579 1726882171.31235: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882171.31236: variable 'ansible_pipelining' from source: unknown
11579 1726882171.31238: variable 'ansible_timeout' from source: unknown
11579 1726882171.31240: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882171.31449: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11579 1726882171.31457: variable 'omit' from source: magic vars
11579 1726882171.31462: starting attempt loop
11579 1726882171.31465: running the handler
11579 1726882171.31478: variable 'ansible_facts' from source: unknown
11579 1726882171.31496: _low_level_execute_command(): starting
11579 1726882171.31506: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11579 1726882171.31974: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<<
11579 1726882171.32008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11579 1726882171.32011: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<<
11579 1726882171.32014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11579 1726882171.32065: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
11579 1726882171.32069: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11579 1726882171.32071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11579 1726882171.32123: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11579 1726882171.33829: stdout chunk (state=3): >>>/root <<<
11579 1726882171.33978: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11579 1726882171.33981: stdout chunk (state=3): >>><<<
11579 1726882171.33983: stderr chunk (state=3): >>><<<
11579 1726882171.34005: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
11579 1726882171.34026: _low_level_execute_command(): starting
11579 1726882171.34104: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745 `" && echo ansible-tmp-1726882171.3401163-11594-16815708766745="` echo /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745 `" ) && sleep 0'
11579 1726882171.34709: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<<
11579 1726882171.34773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<<
11579 1726882171.34791: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<<
11579 1726882171.34829: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<<
11579 1726882171.34903: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<<
11579 1726882171.36773: stdout chunk (state=3): >>>ansible-tmp-1726882171.3401163-11594-16815708766745=/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745 <<<
11579 1726882171.36938: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<<
11579 1726882171.36941: stdout chunk (state=3): >>><<<
11579 1726882171.36944: stderr chunk (state=3): >>><<<
11579 1726882171.37099: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882171.3401163-11594-16815708766745=/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882171.37102: variable 'ansible_module_compression' from source: unknown 11579 1726882171.37104: ANSIBALLZ: Using generic lock for ansible.legacy.setup 11579 1726882171.37106: ANSIBALLZ: Acquiring lock 11579 1726882171.37109: ANSIBALLZ: Lock acquired: 139873763448672 11579 1726882171.37111: ANSIBALLZ: Creating module 11579 1726882171.85250: ANSIBALLZ: Writing module into payload 11579 1726882171.85406: ANSIBALLZ: Writing module 11579 1726882171.85434: ANSIBALLZ: Renaming module 11579 1726882171.85451: ANSIBALLZ: Done creating module 11579 1726882171.85478: variable 'ansible_facts' from source: unknown 11579 1726882171.85490: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882171.85508: _low_level_execute_command(): starting 11579 1726882171.85519: _low_level_execute_command(): executing: /bin/sh -c 'echo PLATFORM; uname; echo FOUND; command -v '"'"'python3.12'"'"'; command -v '"'"'python3.11'"'"'; command -v '"'"'python3.10'"'"'; command -v '"'"'python3.9'"'"'; command -v '"'"'python3.8'"'"'; command -v '"'"'python3.7'"'"'; command -v '"'"'/usr/bin/python3'"'"'; command -v '"'"'python3'"'"'; echo ENDFOUND && sleep 0' 11579 1726882171.86225: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882171.86245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882171.86348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882171.86500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882171.86629: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882171.86680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882171.88366: stdout chunk (state=3): >>>PLATFORM <<< 11579 1726882171.88430: stdout chunk (state=3): >>>Linux <<< 11579 1726882171.88461: stdout chunk (state=3): >>>FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND <<< 11579 1726882171.88652: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882171.88656: stdout chunk (state=3): >>><<< 11579 1726882171.88658: stderr chunk (state=3): >>><<< 11579 1726882171.88680: _low_level_execute_command() done: rc=0, stdout=PLATFORM Linux FOUND /usr/bin/python3.12 /usr/bin/python3 /usr/bin/python3 ENDFOUND , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882171.88702 [managed_node1]: found interpreters: ['/usr/bin/python3.12', '/usr/bin/python3', '/usr/bin/python3'] 11579 1726882171.88792: _low_level_execute_command(): starting 11579 1726882171.88798: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 && sleep 0' 11579 1726882171.88935: Sending initial data 11579 1726882171.88938: Sent initial data (1181 bytes) 11579 1726882171.89400: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882171.89461: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882171.89521: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882171.89550: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882171.89609: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882171.89632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882171.93023: stdout chunk (state=3): >>>{"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 (Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} <<< 11579 1726882171.93599: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882171.93602: stdout chunk (state=3): >>><<< 11579 1726882171.93605: stderr chunk (state=3): >>><<< 11579 1726882171.93607: _low_level_execute_command() done: rc=0, stdout={"platform_dist_result": [], "osrelease_content": "NAME=\"CentOS Stream\"\nVERSION=\"10 (Coughlan)\"\nID=\"centos\"\nID_LIKE=\"rhel fedora\"\nVERSION_ID=\"10\"\nPLATFORM_ID=\"platform:el10\"\nPRETTY_NAME=\"CentOS Stream 10 
(Coughlan)\"\nANSI_COLOR=\"0;31\"\nLOGO=\"fedora-logo-icon\"\nCPE_NAME=\"cpe:/o:centos:centos:10\"\nHOME_URL=\"https://centos.org/\"\nVENDOR_NAME=\"CentOS\"\nVENDOR_URL=\"https://centos.org/\"\nBUG_REPORT_URL=\"https://issues.redhat.com/\"\nREDHAT_SUPPORT_PRODUCT=\"Red Hat Enterprise Linux 10\"\nREDHAT_SUPPORT_PRODUCT_VERSION=\"CentOS Stream\"\n"} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882171.93610: variable 'ansible_facts' from source: unknown 11579 1726882171.93612: variable 'ansible_facts' from source: unknown 11579 1726882171.93614: variable 'ansible_module_compression' from source: unknown 11579 1726882171.93616: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11579 1726882171.93654: variable 'ansible_facts' from source: unknown 11579 1726882171.93823: 
transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py 11579 1726882171.93969: Sending initial data 11579 1726882171.94073: Sent initial data (153 bytes) 11579 1726882171.94616: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882171.94629: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882171.94642: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882171.94657: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882171.94670: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882171.94680: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882171.94691: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882171.94727: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882171.94795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882171.94813: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882171.94840: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882171.94908: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 
<<< 11579 1726882171.96417: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882171.96486: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882171.96552: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpbjtjm0n_ /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py <<< 11579 1726882171.96558: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py" <<< 11579 1726882171.96600: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpbjtjm0n_" to remote "/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py" <<< 11579 1726882171.98075: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882171.98117: stderr chunk (state=3): >>><<< 11579 1726882171.98121: stdout chunk (state=3): >>><<< 11579 1726882171.98230: done transferring module to remote 11579 
1726882171.98233: _low_level_execute_command(): starting 11579 1726882171.98235: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/ /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py && sleep 0' 11579 1726882171.98780: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882171.98796: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882171.98812: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882171.98852: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882171.98907: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882171.98960: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882171.98981: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882171.98995: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882171.99074: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882172.00829: stderr chunk (state=3): >>>debug2: 
Received exit status from master 0 <<< 11579 1726882172.00832: stdout chunk (state=3): >>><<< 11579 1726882172.00835: stderr chunk (state=3): >>><<< 11579 1726882172.00849: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882172.00857: _low_level_execute_command(): starting 11579 1726882172.00865: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/AnsiballZ_setup.py && sleep 0' 11579 1726882172.01459: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882172.01473: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882172.01489: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 
1726882172.01511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882172.01577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882172.01624: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882172.01641: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882172.01672: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882172.01750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882172.03860: stdout chunk (state=3): >>>import _frozen_importlib # frozen <<< 11579 1726882172.03898: stdout chunk (state=3): >>>import _imp # builtin <<< 11579 1726882172.03901: stdout chunk (state=3): >>>import '_thread' # import '_warnings' # <<< 11579 1726882172.03924: stdout chunk (state=3): >>>import '_weakref' # <<< 11579 1726882172.03985: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11579 1726882172.04049: stdout chunk (state=3): >>>import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook <<< 11579 1726882172.04084: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # # installed zipimport hook <<< 11579 
1726882172.04129: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.04163: stdout chunk (state=3): >>>import '_codecs' # <<< 11579 1726882172.04174: stdout chunk (state=3): >>>import 'codecs' # <<< 11579 1726882172.04220: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11579 1726882172.04251: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e78bb00> <<< 11579 1726882172.04292: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' <<< 11579 1726882172.04322: stdout chunk (state=3): >>>import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7bea50> import '_signal' # <<< 11579 1726882172.04352: stdout chunk (state=3): >>>import '_abc' # import 'abc' # <<< 11579 1726882172.04365: stdout chunk (state=3): >>>import 'io' # <<< 11579 1726882172.04403: stdout chunk (state=3): >>>import '_stat' # import 'stat' # <<< 11579 1726882172.04486: stdout chunk (state=3): >>>import '_collections_abc' # <<< 11579 1726882172.04514: stdout chunk (state=3): >>>import 'genericpath' # import 'posixpath' # <<< 11579 1726882172.04549: stdout chunk (state=3): >>>import 'os' # <<< 11579 1726882172.04580: stdout chunk (state=3): >>>import '_sitebuiltins' # <<< 11579 1726882172.04607: stdout chunk (state=3): >>>Processing user 
site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' <<< 11579 1726882172.04642: stdout chunk (state=3): >>>Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11579 1726882172.04662: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' <<< 11579 1726882172.04674: stdout chunk (state=3): >>>import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7cd130> <<< 11579 1726882172.04721: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py <<< 11579 1726882172.04740: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7cdfa0> <<< 11579 1726882172.04767: stdout chunk (state=3): >>>import 'site' # <<< 11579 1726882172.04798: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
<<< 11579 1726882172.05164: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11579 1726882172.05200: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11579 1726882172.05203: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11579 1726882172.05232: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11579 1726882172.05268: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11579 1726882172.05307: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py <<< 11579 1726882172.05311: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' <<< 11579 1726882172.05361: stdout chunk (state=3): >>>import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5cbda0> <<< 11579 1726882172.05368: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' <<< 11579 1726882172.05398: stdout chunk (state=3): >>>import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5cbfe0> <<< 11579 1726882172.05412: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py <<< 11579 1726882172.05454: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' <<< 11579 1726882172.05469: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11579 1726882172.05516: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.05526: stdout chunk (state=3): >>>import 'itertools' # <<< 11579 1726882172.05572: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6037a0> <<< 11579 1726882172.05600: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py <<< 11579 1726882172.05614: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e603e30> import '_collections' # <<< 11579 1726882172.05671: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e3a70> import '_functools' # <<< 11579 1726882172.05703: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e1190> <<< 11579 1726882172.05805: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c8f50> <<< 11579 1726882172.05824: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11579 1726882172.05840: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11579 1726882172.05872: stdout chunk (state=3): >>>import '_sre' # <<< 11579 1726882172.05899: stdout chunk (state=3): >>># 
/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11579 1726882172.05923: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' <<< 11579 1726882172.05964: stdout chunk (state=3): >>>import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e623710> <<< 11579 1726882172.05982: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e622330> <<< 11579 1726882172.06018: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5ca810> <<< 11579 1726882172.06072: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py <<< 11579 1726882172.06091: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c81d0> <<< 11579 1726882172.06126: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' <<< 11579 1726882172.06151: stdout chunk (state=3): >>># extension module '_struct' loaded from 
'/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e658c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e658b00> <<< 11579 1726882172.06190: stdout chunk (state=3): >>># extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.06203: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e658ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c6cf0> <<< 11579 1726882172.06247: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.06264: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py <<< 11579 1726882172.06298: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11579 1726882172.06316: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6595b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e659280> import 'importlib.machinery' # <<< 11579 1726882172.06374: stdout chunk (state=3): >>># /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 
11579 1726882172.06377: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65a4b0> <<< 11579 1726882172.06410: stdout chunk (state=3): >>>import 'importlib.util' # import 'runpy' # <<< 11579 1726882172.06413: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11579 1726882172.06455: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11579 1726882172.06504: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6706e0> <<< 11579 1726882172.06523: stdout chunk (state=3): >>>import 'errno' # <<< 11579 1726882172.06526: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.06562: stdout chunk (state=3): >>>import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e671df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11579 1726882172.06572: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11579 1726882172.06606: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e672c60> <<< 11579 1726882172.06656: stdout chunk (state=3): >>># extension module 
'_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e6732c0> <<< 11579 1726882172.06673: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6721b0> <<< 11579 1726882172.06697: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11579 1726882172.06741: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.06744: stdout chunk (state=3): >>># extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.06762: stdout chunk (state=3): >>>import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e673d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e673470> <<< 11579 1726882172.06829: stdout chunk (state=3): >>>import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65a510> <<< 11579 1726882172.06881: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11579 1726882172.06916: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11579 1726882172.06936: stdout chunk (state=3): >>># extension module 'math' loaded from 
'/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e373b90> <<< 11579 1726882172.06975: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11579 1726882172.06994: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39c620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39c3b0> <<< 11579 1726882172.07015: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39c650> <<< 11579 1726882172.07056: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py <<< 11579 1726882172.07068: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11579 1726882172.07115: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.07235: stdout chunk (state=3): >>># extension module '_hashlib' executed from 
'/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39cf80> <<< 11579 1726882172.07354: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39d970> <<< 11579 1726882172.07383: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39c830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e371d30> <<< 11579 1726882172.07423: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py <<< 11579 1726882172.07427: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' <<< 11579 1726882172.07459: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' <<< 11579 1726882172.07491: stdout chunk (state=3): >>>import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39ed20> <<< 11579 1726882172.07497: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39da90> <<< 11579 1726882172.07535: stdout chunk (state=3): >>>import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65ac00> <<< 11579 1726882172.07538: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11579 1726882172.07592: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.07624: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11579 1726882172.07648: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' <<< 11579 1726882172.07676: stdout chunk (state=3): >>>import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3cb080> <<< 11579 1726882172.07756: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11579 1726882172.07759: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.07788: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11579 1726882172.07791: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11579 1726882172.07828: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3eb410> <<< 11579 1726882172.07856: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11579 1726882172.07895: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11579 1726882172.07958: stdout chunk (state=3): >>>import 'ntpath' # <<< 11579 1726882172.07999: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7feb9e44c1d0> <<< 11579 1726882172.08003: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11579 1726882172.08024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11579 1726882172.08051: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11579 1726882172.08097: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11579 1726882172.08216: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e44e930> <<< 11579 1726882172.08348: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e44c2f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e4191f0> <<< 11579 1726882172.08379: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3ea240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39fc50> <<< 11579 1726882172.08595: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb9e3ea330> <<< 11579 1726882172.08834: stdout chunk (state=3): >>># zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z70eq_82/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available 
<<< 11579 1726882172.08883: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.08913: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py <<< 11579 1726882172.08937: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' <<< 11579 1726882172.08964: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11579 1726882172.09052: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' <<< 11579 1726882172.09076: stdout chunk (state=3): >>># /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd8afc0> import '_typing' # <<< 11579 1726882172.09274: stdout chunk (state=3): >>>import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd69eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd69010> # zipimport: zlib available <<< 11579 1726882172.09302: stdout chunk (state=3): >>>import 'ansible' # <<< 11579 1726882172.09319: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.09349: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.09378: stdout chunk (state=3): >>>import 'ansible.module_utils' # <<< 11579 1726882172.09407: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.10754: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.12319: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from 
'/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd88e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbe9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd8bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbf770> # extension module 'fcntl' loaded from 
'/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbf9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbfef0> <<< 11579 1726882172.12377: stdout chunk (state=3): >>>import 'pwd' # <<< 11579 1726882172.12382: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11579 1726882172.12387: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11579 1726882172.12391: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc25d00> <<< 11579 1726882172.12444: stdout chunk (state=3): >>># extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc27920> <<< 11579 1726882172.12447: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py <<< 11579 1726882172.12463: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11579 1726882172.12531: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code 
object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11579 1726882172.12534: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc29490> <<< 11579 1726882172.12583: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' <<< 11579 1726882172.12771: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' <<< 11579 1726882172.12916: stdout chunk (state=3): >>>import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2a240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11579 1726882172.12919: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches 
/usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc33fb0> <<< 11579 1726882172.12938: stdout chunk (state=3): >>>import '_tokenize' # <<< 11579 1726882172.13004: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc327e0> <<< 11579 1726882172.13058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11579 1726882172.13091: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc32d50> <<< 11579 1726882172.13119: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2a750> <<< 11579 1726882172.13148: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc78200> <<< 11579 1726882172.13169: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py <<< 11579 1726882172.13235: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches 
/usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11579 1726882172.13269: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc79be0> <<< 11579 1726882172.13328: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11579 1726882172.13469: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11579 1726882172.13473: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc7c350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' <<< 11579 1726882172.13475: stdout chunk 
(state=3): >>>import '_string' # <<< 11579 1726882172.13503: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7fad0> <<< 11579 1726882172.13622: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7c4a0> <<< 11579 1726882172.13673: stdout chunk (state=3): >>># extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.13713: stdout chunk (state=3): >>># extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc80e00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc7d010> <<< 11579 1726882172.13775: stdout chunk (state=3): >>># extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc80c50> <<< 11579 1726882172.13862: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc78530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' 
<<< 11579 1726882172.13886: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db0c290> <<< 11579 1726882172.14014: stdout chunk (state=3): >>># extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db0d2e0> <<< 11579 1726882172.14032: stdout chunk (state=3): >>>import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc82a20> <<< 11579 1726882172.14098: stdout chunk (state=3): >>># extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc83dd0> <<< 11579 1726882172.14122: stdout chunk (state=3): >>>import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc82630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available <<< 11579 1726882172.14208: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.14281: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.14325: stdout chunk (state=3): 
>>># zipimport: zlib available import 'ansible.module_utils.common' # <<< 11579 1726882172.14411: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11579 1726882172.14525: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.14573: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.15088: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.15623: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11579 1726882172.15646: stdout chunk (state=3): >>>import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11579 1726882172.15720: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.15735: stdout chunk (state=3): >>># extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db11580> <<< 11579 1726882172.15795: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' <<< 11579 1726882172.15862: stdout chunk (state=3): >>>import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db12330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db0d4f0> <<< 11579 1726882172.15881: stdout chunk (state=3): 
>>>import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available <<< 11579 1726882172.15945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.15958: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # # zipimport: zlib available <<< 11579 1726882172.16207: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.16231: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db12120> # zipimport: zlib available <<< 11579 1726882172.16674: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.17107: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.17179: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.17326: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.17345: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # # zipimport: zlib available <<< 11579 1726882172.17412: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.17485: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11579 1726882172.17640: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # <<< 11579 1726882172.17654: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available <<< 11579 1726882172.17837: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.18065: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py 
<<< 11579 1726882172.18111: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11579 1726882172.18192: stdout chunk (state=3): >>>import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db13560> # zipimport: zlib available <<< 11579 1726882172.18265: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.18332: stdout chunk (state=3): >>>import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # <<< 11579 1726882172.18359: stdout chunk (state=3): >>>import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # <<< 11579 1726882172.18460: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # <<< 11579 1726882172.18463: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.18504: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.18626: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.18662: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11579 1726882172.18697: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.18772: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db1e150> <<< 11579 1726882172.18806: stdout chunk (state=3): >>>import 
'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db1bec0> <<< 11579 1726882172.19018: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.19043: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py <<< 11579 1726882172.19065: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py <<< 11579 1726882172.19081: stdout chunk (state=3): >>># code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11579 1726882172.19101: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11579 1726882172.19154: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' <<< 11579 1726882172.19178: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py <<< 11579 1726882172.19188: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' <<< 11579 1726882172.19244: stdout chunk (state=3): >>>import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc06b10> <<< 11579 1726882172.19277: stdout chunk (state=3): >>>import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dcfe7e0> <<< 11579 1726882172.19359: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 
0x7feb9db1e330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db0d490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # <<< 11579 1726882172.19386: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19402: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19424: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11579 1726882172.19473: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11579 1726882172.19522: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.19525: stdout chunk (state=3): >>>import 'ansible.modules' # # zipimport: zlib available <<< 11579 1726882172.19581: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19642: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19657: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19718: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19721: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19760: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19797: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19839: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.namespace' # <<< 11579 1726882172.19842: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19921: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.19977: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.20019: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.20040: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.typing' # <<< 11579 1726882172.20217: stdout chunk (state=3): >>># zipimport: 
zlib available <<< 11579 1726882172.20220: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.20379: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.20414: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.20469: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882172.20494: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py <<< 11579 1726882172.20517: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' <<< 11579 1726882172.20535: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py <<< 11579 1726882172.20557: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' <<< 11579 1726882172.20724: stdout chunk (state=3): >>>import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' 
import '_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a0110> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.20745: stdout chunk (state=3): >>># extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a06b0> <<< 11579 1726882172.20785: stdout chunk (state=3): >>>import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db980b0> <<< 11579 1726882172.20808: stdout chunk (state=3): >>>import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb2cf0> <<< 11579 1726882172.20836: stdout chunk (state=3): >>>import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0830> <<< 11579 1726882172.20855: stdout chunk (state=3): >>>import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0470> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py <<< 11579 1726882172.21024: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a3410> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a2cc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a2ea0> <<< 11579 1726882172.21039: stdout chunk (state=3): >>>import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a20f0> <<< 11579 1726882172.21058: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py <<< 11579 1726882172.21135: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' <<< 11579 1726882172.21157: stdout chunk (state=3): >>>import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a35c0> <<< 11579 1726882172.21168: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py <<< 11579 1726882172.21198: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' <<< 11579 1726882172.21229: stdout chunk (state=3): >>># extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.21242: stdout chunk (state=3): >>># extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d8020f0> <<< 11579 1726882172.21257: stdout chunk (state=3): 
>>>import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d800110> <<< 11579 1726882172.21299: stdout chunk (state=3): >>>import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0500> import 'ansible.module_utils.facts.timeout' # <<< 11579 1726882172.21306: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.collector' # <<< 11579 1726882172.21328: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # <<< 11579 1726882172.21395: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21408: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21462: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.facter' # <<< 11579 1726882172.21474: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21520: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21604: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.other.ohai' # <<< 11579 1726882172.21607: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # <<< 11579 1726882172.21830: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.21834: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.21862: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.chroot' # <<< 11579 1726882172.21881: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21930: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.21987: stdout chunk (state=3): >>># zipimport: zlib 
available <<< 11579 1726882172.22042: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.22098: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.utils' # <<< 11579 1726882172.22213: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available <<< 11579 1726882172.22583: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23006: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.distribution' # <<< 11579 1726882172.23023: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23068: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23118: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23151: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23183: stdout chunk (state=3): >>>import 'ansible.module_utils.compat.datetime' # <<< 11579 1726882172.23203: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available <<< 11579 1726882172.23228: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23259: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.env' # <<< 11579 1726882172.23269: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23419: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.23449: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.fips' # <<< 11579 1726882172.23468: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11579 1726882172.23484: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23515: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.loadavg' # <<< 11579 1726882172.23530: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11579 1726882172.23708: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.23731: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d803680> <<< 11579 1726882172.23746: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py <<< 11579 1726882172.23765: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' <<< 11579 1726882172.23872: stdout chunk (state=3): >>>import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d802960> <<< 11579 1726882172.23895: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available <<< 11579 1726882172.23955: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24018: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.lsb' # <<< 11579 1726882172.24030: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24215: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available <<< 11579 1726882172.24273: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24343: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.platform' # <<< 11579 1726882172.24516: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' <<< 11579 1726882172.24542: stdout chunk (state=3): >>># extension module '_ssl' 
loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882172.24600: stdout chunk (state=3): >>># extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d842300> <<< 11579 1726882172.24779: stdout chunk (state=3): >>>import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d832f90> import 'ansible.module_utils.facts.system.python' # <<< 11579 1726882172.24803: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24846: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24906: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.system.selinux' # <<< 11579 1726882172.24923: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.24990: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.25071: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.25449: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available <<< 11579 1726882172.25566: stdout chunk (state=3): >>># zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d855dc0><<< 11579 
1726882172.25692: stdout chunk (state=3): >>> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d855d00> import 'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # <<< 11579 1726882172.25698: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.25843: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.25990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.aix' # <<< 11579 1726882172.26012: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26102: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26199: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26233: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26277: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.sysctl' # <<< 11579 1726882172.26416: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available <<< 11579 1726882172.26438: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.26469: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26618: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # <<< 11579 1726882172.26633: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26740: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26861: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hpux' # <<< 11579 1726882172.26882: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.26908: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11579 1726882172.26936: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.27541: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.27970: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.linux' # <<< 11579 1726882172.27990: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available <<< 11579 1726882172.28088: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.28195: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available <<< 11579 1726882172.28298: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.28510: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.openbsd' # <<< 11579 1726882172.28513: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.28547: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.28700: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.hardware.sunos' # <<< 11579 1726882172.28726: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # <<< 11579 1726882172.28822: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # <<< 11579 1726882172.28837: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.28922: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29027: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29231: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29422: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # <<< 11579 1726882172.29540: stdout chunk 
(state=3): >>># zipimport: zlib available <<< 11579 1726882172.29544: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.29568: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available <<< 11579 1726882172.29635: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29736: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.fc_wwn' # <<< 11579 1726882172.29825: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # <<< 11579 1726882172.29838: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29852: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29876: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hpux' # <<< 11579 1726882172.29894: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.29945: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30054: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.hurd' # <<< 11579 1726882172.30057: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30263: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30527: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.linux' # <<< 11579 1726882172.30533: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30586: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30702: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.iscsi' # <<< 11579 1726882172.30726: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30740: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available <<< 11579 1726882172.30758: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30787: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.netbsd' # <<< 11579 1726882172.30822: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.30918: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available <<< 11579 1726882172.30958: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31038: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.network.sunos' # <<< 11579 1726882172.31086: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available <<< 11579 1726882172.31171: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31188: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882172.31209: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31256: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31350: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31384: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31441: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sysctl' # <<< 11579 1726882172.31452: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available <<< 11579 1726882172.31501: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31614: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available <<< 11579 
1726882172.31750: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31945: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.linux' # <<< 11579 1726882172.31957: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.31998: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32042: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.netbsd' # <<< 11579 1726882172.32166: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32170: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available <<< 11579 1726882172.32247: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32323: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # <<< 11579 1726882172.32365: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32424: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32512: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.ansible_collector' # <<< 11579 1726882172.32529: stdout chunk (state=3): >>>import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # <<< 11579 1726882172.32588: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882172.32774: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' <<< 11579 1726882172.32778: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' <<< 11579 1726882172.33025: stdout chunk (state=3): 
>>># extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d657080> <<< 11579 1726882172.33029: stdout chunk (state=3): >>>import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d657260> <<< 11579 1726882172.33031: stdout chunk (state=3): >>>import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d654aa0> <<< 11579 1726882172.44484: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py <<< 11579 1726882172.44518: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d657f80> <<< 11579 1726882172.44613: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69c8f0> <<< 11579 1726882172.44687: stdout chunk (state=3): >>># /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from 
'/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69e150> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69dc10> <<< 11579 1726882172.44925: stdout chunk (state=3): >>>PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame <<< 11579 1726882172.68970: stdout chunk (state=3): >>> <<< 11579 1726882172.69096: stdout chunk (state=3): >>>{"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3301, "used": 230}, "swap": 
{"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 605, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793837056, 
"block_size": 4096, "block_total": 65519099, "block_available": 63914511, "block_used": 1604588, "inode_total": 131070960, "inode_available": 131029073, "inode_used": 41887, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LY<<< 11579 1726882172.69106: stdout chunk (state=3): >>>leTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "32", "epoch": "1726882172", "epoch_int": "1726882172", "date": "2024-09-20", "time": "21:29:32", "iso8601_micro": "2024-09-21T01:29:32.648016Z", "iso8601": "2024-09-21T01:29:32Z", "iso8601_basic": 
"20240920T212932648016", "iso8601_basic_short": "20240920T212932", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": 
"/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.177734375, "15m": 0.08837890625}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on 
[fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": 
"on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", 
"rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11579 1726882172.69647: stdout chunk (state=3): >>># clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ <<< 11579 1726882172.69757: stdout chunk (state=3): >>># clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # 
cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types <<< 11579 1726882172.69761: stdout chunk (state=3): >>># cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] 
removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd <<< 11579 1726882172.69879: stdout chunk (state=3): >>># destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing 
datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing 
ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process <<< 11579 1726882172.69963: stdout chunk (state=3): >>># cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # 
cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] 
removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing 
ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy 
ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy 
ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy <<< 11579 1726882172.70443: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json <<< 11579 1726882172.70446: stdout chunk (state=3): >>># destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal <<< 11579 1726882172.70448: stdout chunk (state=3): >>># destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil <<< 11579 1726882172.70451: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy logging <<< 11579 1726882172.70572: stdout chunk (state=3): >>># destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy 
multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle <<< 11579 1726882172.70607: stdout chunk (state=3): >>># destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl <<< 11579 1726882172.70676: stdout chunk (state=3): >>># destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json <<< 11579 1726882172.70722: stdout chunk (state=3): >>># destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux <<< 11579 1726882172.70879: stdout chunk (state=3): >>># cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] 
wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat <<< 11579 1726882172.70901: stdout chunk (state=3): >>># destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11579 1726882172.71016: stdout chunk (state=3): >>># destroy sys.monitoring <<< 11579 1726882172.71061: stdout chunk (state=3): >>># destroy _socket # destroy _collections <<< 11579 1726882172.71166: stdout 
chunk (state=3): >>># destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize <<< 11579 1726882172.71199: stdout chunk (state=3): >>># destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11579 1726882172.71253: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases <<< 11579 1726882172.71303: stdout chunk (state=3): >>># destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11579 1726882172.71401: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re <<< 11579 1726882172.71414: stdout chunk (state=3): >>># destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks <<< 11579 1726882172.71716: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882172.71902: stderr chunk (state=3): >>><<< 11579 1726882172.71906: stdout chunk (state=3): >>><<< 11579 1726882172.72312: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7bc4d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e78bb00> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7bea50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches 
/usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7cd130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e7cdfa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. # /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5cbda0> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5cbfe0> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object 
from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6037a0> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e603e30> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e3a70> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e1190> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c8f50> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e623710> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e622330> # 
/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5e2060> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5ca810> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6587a0> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c81d0> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e658c50> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e658b00> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e658ec0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e5c6cf0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # 
/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6595b0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e659280> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65a4b0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6706e0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e671df0> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e672c60> # extension module '_bz2' 
loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e6732c0> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e6721b0> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e673d40> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e673470> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65a510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e373b90> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39c620> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39c3b0> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39c650> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39cf80> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9e39d970> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39c830> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e371d30> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches 
/usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39ed20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39da90> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e65ac00> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3cb080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3eb410> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e44c1d0> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code 
object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e44e930> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e44c2f0> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e4191f0> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd292e0> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e3ea240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9e39fc50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7feb9e3ea330> # zipimport: found 103 names in '/tmp/ansible_ansible.legacy.setup_payload_z70eq_82/ansible_ansible.legacy.setup_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd8afc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd69eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd69010> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd88e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbe9f0> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe780> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe090> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from 
'/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbe4e0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dd8bc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbf770> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9ddbf9b0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9ddbfef0> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc25d00> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc27920> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc282f0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc29490> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2bf80> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc302c0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2a240> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' 
# <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc33fb0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc32a80> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc327e0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc32d50> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc2a750> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc78200> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc783b0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc79e50> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc79be0> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc7c350> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7a480> # /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7fad0> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc7c4a0> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc80e00> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' 
# <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc7d010> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc80c50> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc78530> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db0c290> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db0d2e0> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc82a20> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9dc83dd0> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc82630> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db11580> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db12330> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db0d4f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc 
matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db12120> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db13560> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed 
from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9db1e150> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db1bec0> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dc06b10> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dcfe7e0> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db1e330> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db0d490> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: 
zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.namespace' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.typing' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/context.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/context.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/process.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/process.cpython-312.pyc' import 'multiprocessing.process' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb2150> # /usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/reduction.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/reduction.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc matches /usr/lib64/python3.12/pickle.py # code object from '/usr/lib64/python3.12/__pycache__/pickle.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc matches /usr/lib64/python3.12/_compat_pickle.py # code object from '/usr/lib64/python3.12/__pycache__/_compat_pickle.cpython-312.pyc' import 
'_compat_pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a0110> # extension module '_pickle' loaded from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' # extension module '_pickle' executed from '/usr/lib64/python3.12/lib-dynload/_pickle.cpython-312-x86_64-linux-gnu.so' import '_pickle' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a06b0> import 'pickle' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9db980b0> import 'multiprocessing.reduction' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb2cf0> import 'multiprocessing.context' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0830> import 'multiprocessing' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0470> # /usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/pool.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/pool.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc matches /usr/lib64/python3.12/queue.py # code object from '/usr/lib64/python3.12/__pycache__/queue.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc matches /usr/lib64/python3.12/heapq.py # code object from '/usr/lib64/python3.12/__pycache__/heapq.cpython-312.pyc' # extension module '_heapq' loaded from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' # extension module '_heapq' executed from '/usr/lib64/python3.12/lib-dynload/_heapq.cpython-312-x86_64-linux-gnu.so' import '_heapq' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a3410> import 'heapq' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a2cc0> # extension module '_queue' loaded from '/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' # extension module '_queue' executed from 
'/usr/lib64/python3.12/lib-dynload/_queue.cpython-312-x86_64-linux-gnu.so' import '_queue' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d7a2ea0> import 'queue' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a20f0> # /usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/util.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/util.cpython-312.pyc' import 'multiprocessing.util' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d7a35c0> # /usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/connection.cpython-312.pyc' # extension module '_multiprocessing' loaded from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' # extension module '_multiprocessing' executed from '/usr/lib64/python3.12/lib-dynload/_multiprocessing.cpython-312-x86_64-linux-gnu.so' import '_multiprocessing' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d8020f0> import 'multiprocessing.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d800110> import 'multiprocessing.pool' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9dbb0500> import 'ansible.module_utils.facts.timeout' # import 'ansible.module_utils.facts.collector' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.facter' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.other.ohai' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system' # # zipimport: zlib available # zipimport: zlib available import 
'ansible.module_utils.facts.system.apparmor' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.caps' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.chroot' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.utils' # import 'ansible.module_utils.facts.system.cmdline' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.distribution' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.datetime' # import 'ansible.module_utils.facts.system.date_time' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.env' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.dns' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.fips' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.loadavg' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/glob.py # code object from '/usr/lib64/python3.12/__pycache__/glob.cpython-312.pyc' import 'glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d803680> # /usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc matches /usr/lib64/python3.12/configparser.py # code object from '/usr/lib64/python3.12/__pycache__/configparser.cpython-312.pyc' import 'configparser' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d802960> import 'ansible.module_utils.facts.system.local' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.lsb' # # zipimport: 
zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.pkg_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.platform' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc matches /usr/lib64/python3.12/ssl.py # code object from '/usr/lib64/python3.12/__pycache__/ssl.cpython-312.pyc' # extension module '_ssl' loaded from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' # extension module '_ssl' executed from '/usr/lib64/python3.12/lib-dynload/_ssl.cpython-312-x86_64-linux-gnu.so' import '_ssl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d842300> import 'ssl' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d832f90> import 'ansible.module_utils.facts.system.python' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.selinux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat.version' # import 'ansible.module_utils.facts.system.service_mgr' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.system.ssh_pub_keys' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc matches /usr/lib64/python3.12/getpass.py # code object from '/usr/lib64/python3.12/__pycache__/getpass.cpython-312.pyc' # extension module 'termios' loaded from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' # extension module 'termios' executed from '/usr/lib64/python3.12/lib-dynload/termios.cpython-312-x86_64-linux-gnu.so' import 'termios' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d855dc0> import 'getpass' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d855d00> import 
'ansible.module_utils.facts.system.user' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.base' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.aix' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.sysctl' # import 'ansible.module_utils.facts.hardware.darwin' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.freebsd' # import 'ansible.module_utils.facts.hardware.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.hpux' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.linux' # import 'ansible.module_utils.facts.hardware.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.hardware.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.generic_bsd' # import 'ansible.module_utils.facts.network.aix' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.darwin' # # 
zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.fc_wwn' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.freebsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.hurd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.linux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.iscsi' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.nvme' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.network.sunos' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.base' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sysctl' # import 'ansible.module_utils.facts.virtual.freebsd' # import 'ansible.module_utils.facts.virtual.dragonfly' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.hpux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.linux' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.facts.virtual.netbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.openbsd' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.virtual.sunos' # import 'ansible.module_utils.facts.default_collectors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.facts.ansible_collector' # import 'ansible.module_utils.facts.compat' # import 'ansible.module_utils.facts' # # zipimport: zlib available # /usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc matches /usr/lib64/python3.12/encodings/idna.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/idna.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc matches /usr/lib64/python3.12/stringprep.py # code object from '/usr/lib64/python3.12/__pycache__/stringprep.cpython-312.pyc' # extension module 'unicodedata' loaded from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' # extension module 'unicodedata' executed from '/usr/lib64/python3.12/lib-dynload/unicodedata.cpython-312-x86_64-linux-gnu.so' import 'unicodedata' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7feb9d657080> import 'stringprep' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d657260> import 'encodings.idna' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d654aa0> # /usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/queues.py # code object from '/usr/lib64/python3.12/multiprocessing/__pycache__/queues.cpython-312.pyc' import 'multiprocessing.queues' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d657f80> # /usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/synchronize.py # code object from 
'/usr/lib64/python3.12/multiprocessing/__pycache__/synchronize.cpython-312.pyc' import 'multiprocessing.synchronize' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69c8f0> # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/__init__.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc matches /usr/lib64/python3.12/multiprocessing/dummy/connection.py # code object from '/usr/lib64/python3.12/multiprocessing/dummy/__pycache__/connection.cpython-312.pyc' import 'multiprocessing.dummy.connection' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69e150> import 'multiprocessing.dummy' # <_frozen_importlib_external.SourceFileLoader object at 0x7feb9d69dc10> PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame PyThreadState_Clear: warning: thread still has a frame {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", 
"Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2969, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 562, "free": 2969}, "nocache": {"free": 3301, "used": 230}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, 
"ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 605, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793837056, "block_size": 4096, "block_total": 65519099, "block_available": 63914511, "block_used": 1604588, "inode_total": 131070960, "inode_available": 131029073, "inode_used": 41887, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", 
"ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "32", "epoch": "1726882172", "epoch_int": "1726882172", "date": "2024-09-20", "time": "21:29:32", "iso8601_micro": "2024-09-21T01:29:32.648016Z", "iso8601": "2024-09-21T01:29:32Z", "iso8601_basic": "20240920T212932648016", "iso8601_basic_short": "20240920T212932", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_apparmor": {"status": "disabled"}, "ansible_fibre_channel_wwn": [], "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": 
"(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_env": {"PYTHONVERBOSE": "1", "SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_fips": false, "ansible_is_chroot": false, "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.177734375, "15m": 0.08837890625}, "ansible_service_mgr": "systemd", "ansible_lsb": {}, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", 
"tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": 
{"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": 
"off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_local": {}, "ansible_pkg_mgr": "dnf", "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings 
# cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # 
cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] 
removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy 
ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # cleanup[2] removing ansible.module_utils.facts.namespace # cleanup[2] removing ansible.module_utils.compat.typing # cleanup[2] removing multiprocessing.process # cleanup[2] removing _compat_pickle # cleanup[2] removing _pickle # cleanup[2] removing pickle # cleanup[2] removing multiprocessing.reduction # cleanup[2] removing 
multiprocessing.context # cleanup[2] removing __mp_main__ # destroy __main__ # cleanup[2] removing multiprocessing # cleanup[2] removing _heapq # cleanup[2] removing heapq # destroy heapq # cleanup[2] removing _queue # cleanup[2] removing queue # cleanup[2] removing multiprocessing.util # cleanup[2] removing _multiprocessing # cleanup[2] removing multiprocessing.connection # cleanup[2] removing multiprocessing.pool # cleanup[2] removing ansible.module_utils.facts.timeout # cleanup[2] removing ansible.module_utils.facts.collector # cleanup[2] removing ansible.module_utils.facts.other # cleanup[2] removing ansible.module_utils.facts.other.facter # cleanup[2] removing ansible.module_utils.facts.other.ohai # cleanup[2] removing ansible.module_utils.facts.system # cleanup[2] removing ansible.module_utils.facts.system.apparmor # cleanup[2] removing ansible.module_utils.facts.system.caps # cleanup[2] removing ansible.module_utils.facts.system.chroot # cleanup[2] removing ansible.module_utils.facts.utils # cleanup[2] removing ansible.module_utils.facts.system.cmdline # cleanup[2] removing ansible.module_utils.facts.system.distribution # cleanup[2] removing ansible.module_utils.compat.datetime # destroy ansible.module_utils.compat.datetime # cleanup[2] removing ansible.module_utils.facts.system.date_time # cleanup[2] removing ansible.module_utils.facts.system.env # cleanup[2] removing ansible.module_utils.facts.system.dns # cleanup[2] removing ansible.module_utils.facts.system.fips # cleanup[2] removing ansible.module_utils.facts.system.loadavg # cleanup[2] removing glob # cleanup[2] removing configparser # cleanup[2] removing ansible.module_utils.facts.system.local # cleanup[2] removing ansible.module_utils.facts.system.lsb # cleanup[2] removing ansible.module_utils.facts.system.pkg_mgr # cleanup[2] removing ansible.module_utils.facts.system.platform # cleanup[2] removing _ssl # cleanup[2] removing ssl # destroy ssl # cleanup[2] removing 
ansible.module_utils.facts.system.python # cleanup[2] removing ansible.module_utils.facts.system.selinux # cleanup[2] removing ansible.module_utils.compat.version # destroy ansible.module_utils.compat.version # cleanup[2] removing ansible.module_utils.facts.system.service_mgr # cleanup[2] removing ansible.module_utils.facts.system.ssh_pub_keys # cleanup[2] removing termios # cleanup[2] removing getpass # cleanup[2] removing ansible.module_utils.facts.system.user # cleanup[2] removing ansible.module_utils.facts.hardware # cleanup[2] removing ansible.module_utils.facts.hardware.base # cleanup[2] removing ansible.module_utils.facts.hardware.aix # cleanup[2] removing ansible.module_utils.facts.sysctl # cleanup[2] removing ansible.module_utils.facts.hardware.darwin # cleanup[2] removing ansible.module_utils.facts.hardware.freebsd # cleanup[2] removing ansible.module_utils.facts.hardware.dragonfly # cleanup[2] removing ansible.module_utils.facts.hardware.hpux # cleanup[2] removing ansible.module_utils.facts.hardware.linux # cleanup[2] removing ansible.module_utils.facts.hardware.hurd # cleanup[2] removing ansible.module_utils.facts.hardware.netbsd # cleanup[2] removing ansible.module_utils.facts.hardware.openbsd # cleanup[2] removing ansible.module_utils.facts.hardware.sunos # cleanup[2] removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] 
removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other # destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy 
ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy 
ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy 
pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping 
importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy 
ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data:
removing ansible.module_utils.facts.network # cleanup[2] removing ansible.module_utils.facts.network.base # cleanup[2] removing ansible.module_utils.facts.network.generic_bsd # cleanup[2] removing ansible.module_utils.facts.network.aix # cleanup[2] removing ansible.module_utils.facts.network.darwin # cleanup[2] removing ansible.module_utils.facts.network.dragonfly # cleanup[2] removing ansible.module_utils.facts.network.fc_wwn # cleanup[2] removing ansible.module_utils.facts.network.freebsd # cleanup[2] removing ansible.module_utils.facts.network.hpux # cleanup[2] removing ansible.module_utils.facts.network.hurd # cleanup[2] removing ansible.module_utils.facts.network.linux # cleanup[2] removing ansible.module_utils.facts.network.iscsi # cleanup[2] removing ansible.module_utils.facts.network.nvme # cleanup[2] removing ansible.module_utils.facts.network.netbsd # cleanup[2] removing ansible.module_utils.facts.network.openbsd # cleanup[2] removing ansible.module_utils.facts.network.sunos # cleanup[2] removing ansible.module_utils.facts.virtual # cleanup[2] removing ansible.module_utils.facts.virtual.base # cleanup[2] removing ansible.module_utils.facts.virtual.sysctl # cleanup[2] removing ansible.module_utils.facts.virtual.freebsd # cleanup[2] removing ansible.module_utils.facts.virtual.dragonfly # cleanup[2] removing ansible.module_utils.facts.virtual.hpux # cleanup[2] removing ansible.module_utils.facts.virtual.linux # cleanup[2] removing ansible.module_utils.facts.virtual.netbsd # cleanup[2] removing ansible.module_utils.facts.virtual.openbsd # cleanup[2] removing ansible.module_utils.facts.virtual.sunos # cleanup[2] removing ansible.module_utils.facts.default_collectors # cleanup[2] removing ansible.module_utils.facts.ansible_collector # cleanup[2] removing ansible.module_utils.facts.compat # cleanup[2] removing ansible.module_utils.facts # destroy ansible.module_utils.facts # destroy ansible.module_utils.facts.namespace # destroy ansible.module_utils.facts.other 
# destroy ansible.module_utils.facts.other.facter # destroy ansible.module_utils.facts.other.ohai # destroy ansible.module_utils.facts.system # destroy ansible.module_utils.facts.system.apparmor # destroy ansible.module_utils.facts.system.caps # destroy ansible.module_utils.facts.system.chroot # destroy ansible.module_utils.facts.system.cmdline # destroy ansible.module_utils.facts.system.distribution # destroy ansible.module_utils.facts.system.date_time # destroy ansible.module_utils.facts.system.env # destroy ansible.module_utils.facts.system.dns # destroy ansible.module_utils.facts.system.fips # destroy ansible.module_utils.facts.system.loadavg # destroy ansible.module_utils.facts.system.local # destroy ansible.module_utils.facts.system.lsb # destroy ansible.module_utils.facts.system.pkg_mgr # destroy ansible.module_utils.facts.system.platform # destroy ansible.module_utils.facts.system.python # destroy ansible.module_utils.facts.system.selinux # destroy ansible.module_utils.facts.system.service_mgr # destroy ansible.module_utils.facts.system.ssh_pub_keys # destroy ansible.module_utils.facts.system.user # destroy ansible.module_utils.facts.utils # destroy ansible.module_utils.facts.hardware # destroy ansible.module_utils.facts.hardware.base # destroy ansible.module_utils.facts.hardware.aix # destroy ansible.module_utils.facts.hardware.darwin # destroy ansible.module_utils.facts.hardware.freebsd # destroy ansible.module_utils.facts.hardware.dragonfly # destroy ansible.module_utils.facts.hardware.hpux # destroy ansible.module_utils.facts.hardware.linux # destroy ansible.module_utils.facts.hardware.hurd # destroy ansible.module_utils.facts.hardware.netbsd # destroy ansible.module_utils.facts.hardware.openbsd # destroy ansible.module_utils.facts.hardware.sunos # destroy ansible.module_utils.facts.sysctl # destroy ansible.module_utils.facts.network # destroy ansible.module_utils.facts.network.base # destroy ansible.module_utils.facts.network.generic_bsd # destroy 
ansible.module_utils.facts.network.aix # destroy ansible.module_utils.facts.network.darwin # destroy ansible.module_utils.facts.network.dragonfly # destroy ansible.module_utils.facts.network.fc_wwn # destroy ansible.module_utils.facts.network.freebsd # destroy ansible.module_utils.facts.network.hpux # destroy ansible.module_utils.facts.network.hurd # destroy ansible.module_utils.facts.network.linux # destroy ansible.module_utils.facts.network.iscsi # destroy ansible.module_utils.facts.network.nvme # destroy ansible.module_utils.facts.network.netbsd # destroy ansible.module_utils.facts.network.openbsd # destroy ansible.module_utils.facts.network.sunos # destroy ansible.module_utils.facts.virtual # destroy ansible.module_utils.facts.virtual.base # destroy ansible.module_utils.facts.virtual.sysctl # destroy ansible.module_utils.facts.virtual.freebsd # destroy ansible.module_utils.facts.virtual.dragonfly # destroy ansible.module_utils.facts.virtual.hpux # destroy ansible.module_utils.facts.virtual.linux # destroy ansible.module_utils.facts.virtual.netbsd # destroy ansible.module_utils.facts.virtual.openbsd # destroy ansible.module_utils.facts.virtual.sunos # destroy ansible.module_utils.facts.compat # cleanup[2] removing unicodedata # cleanup[2] removing stringprep # cleanup[2] removing encodings.idna # cleanup[2] removing multiprocessing.queues # cleanup[2] removing multiprocessing.synchronize # cleanup[2] removing multiprocessing.dummy.connection # cleanup[2] removing multiprocessing.dummy # destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy systemd.journal # destroy systemd.daemon # destroy 
hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy locale # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy logging # destroy ansible.module_utils.facts.default_collectors # destroy ansible.module_utils.facts.ansible_collector # destroy multiprocessing # destroy multiprocessing.queues # destroy multiprocessing.synchronize # destroy multiprocessing.dummy # destroy multiprocessing.pool # destroy signal # destroy pickle # destroy _compat_pickle # destroy _pickle # destroy queue # destroy _heapq # destroy _queue # destroy multiprocessing.reduction # destroy selectors # destroy shlex # destroy fcntl # destroy datetime # destroy subprocess # destroy base64 # destroy _ssl # destroy ansible.module_utils.compat.selinux # destroy getpass # destroy pwd # destroy termios # destroy json # destroy socket # destroy struct # destroy glob # destroy fnmatch # destroy ansible.module_utils.compat.typing # destroy ansible.module_utils.facts.timeout # destroy ansible.module_utils.facts.collector # destroy unicodedata # destroy errno # destroy multiprocessing.connection # destroy tempfile # destroy multiprocessing.context # destroy multiprocessing.process # destroy multiprocessing.util # destroy _multiprocessing # destroy array # destroy multiprocessing.dummy.connection # cleanup[3] wiping encodings.idna # destroy stringprep # cleanup[3] wiping configparser # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # destroy configparser # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] 
wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy 
systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy encodings.idna # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _sre # destroy _string # destroy re # destroy itertools # destroy _abc # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks [WARNING]: Platform linux on host managed_node1 is using the discovered Python interpreter at /usr/bin/python3.12, but future installation of another Python interpreter could change the meaning of that path. See https://docs.ansible.com/ansible- core/2.17/reference_appendices/interpreter_discovery.html for more information. 
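The interpreter-discovery warning above can be avoided by pinning the interpreter explicitly in inventory, so a later Python install cannot change what `/usr/bin/python3.12` means for this host. A minimal sketch of such an inventory fragment (hypothetical values; the real `/tmp/network-Kc3/inventory.yml` is not shown in this log):

```yaml
# Hypothetical inventory fragment: pin the interpreter so discovery
# no longer guesses the path flagged by the warning above.
all:
  hosts:
    managed_node1:
      ansible_python_interpreter: /usr/bin/python3.12
```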
11579 1726882172.76858: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882172.76862: _low_level_execute_command(): starting 11579 1726882172.76864: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882171.3401163-11594-16815708766745/ > /dev/null 2>&1 && sleep 0' 11579 1726882172.78173: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882172.78200: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882172.78214: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882172.78225: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882172.78243: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882172.78253: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882172.78382: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882172.78469: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882172.78518: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882172.78726: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11579 1726882172.81240: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882172.81244: stdout chunk (state=3): >>><<< 11579 1726882172.81247: stderr chunk (state=3): >>><<< 11579 1726882172.81249: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11579 1726882172.81251: handler run complete 11579 1726882172.81748: variable 'ansible_facts' from source: unknown 11579 1726882172.81898: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.82970: variable 'ansible_facts' from source: unknown 11579 1726882172.83214: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.83700: attempt loop complete, returning result 11579 1726882172.83704: _execute() done 11579 1726882172.83706: dumping result to json 11579 1726882172.83708: done dumping result, returning 11579 1726882172.83710: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-f197-7423-0000000000cc] 11579 1726882172.83713: sending task result for task 12673a56-9f93-f197-7423-0000000000cc 11579 1726882172.84954: done sending task result for task 12673a56-9f93-f197-7423-0000000000cc 11579 1726882172.84957: WORKER PROCESS EXITING ok: [managed_node1] 11579 1726882172.85743: no more pending results, returning what we have 11579 1726882172.85746: results queue empty 11579 1726882172.85747: checking for any_errors_fatal 11579 1726882172.85749: done checking for any_errors_fatal 11579 1726882172.85750: checking for max_fail_percentage 11579 1726882172.85751: done checking for max_fail_percentage 11579 1726882172.85752: checking to see if all hosts have failed and the running result is not ok 11579 1726882172.85753: done checking to see if all hosts have failed 11579 1726882172.85754: getting the remaining hosts for this loop 11579 1726882172.85755: done getting the remaining hosts for this loop 11579 1726882172.85759: getting the next task for host managed_node1 11579 1726882172.85764: done getting next task for host managed_node1 11579 1726882172.85766: ^ 
task is: TASK: meta (flush_handlers) 11579 1726882172.85768: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882172.85772: getting variables 11579 1726882172.85773: in VariableManager get_vars() 11579 1726882172.85853: Calling all_inventory to load vars for managed_node1 11579 1726882172.85857: Calling groups_inventory to load vars for managed_node1 11579 1726882172.85860: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882172.85869: Calling all_plugins_play to load vars for managed_node1 11579 1726882172.85872: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882172.85875: Calling groups_plugins_play to load vars for managed_node1 11579 1726882172.86277: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.86883: done with get_vars() 11579 1726882172.86892: done getting variables 11579 1726882172.87188: in VariableManager get_vars() 11579 1726882172.87200: Calling all_inventory to load vars for managed_node1 11579 1726882172.87202: Calling groups_inventory to load vars for managed_node1 11579 1726882172.87204: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882172.87209: Calling all_plugins_play to load vars for managed_node1 11579 1726882172.87211: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882172.87213: Calling groups_plugins_play to load vars for managed_node1 11579 1726882172.87958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.88358: done with get_vars() 11579 1726882172.88370: done queuing things up, now waiting for results queue to drain 11579 
1726882172.88372: results queue empty 11579 1726882172.88373: checking for any_errors_fatal 11579 1726882172.88375: done checking for any_errors_fatal 11579 1726882172.88380: checking for max_fail_percentage 11579 1726882172.88381: done checking for max_fail_percentage 11579 1726882172.88382: checking to see if all hosts have failed and the running result is not ok 11579 1726882172.88383: done checking to see if all hosts have failed 11579 1726882172.88384: getting the remaining hosts for this loop 11579 1726882172.88384: done getting the remaining hosts for this loop 11579 1726882172.88387: getting the next task for host managed_node1 11579 1726882172.88391: done getting next task for host managed_node1 11579 1726882172.88396: ^ task is: TASK: Include the task 'el_repo_setup.yml' 11579 1726882172.88397: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882172.88400: getting variables 11579 1726882172.88400: in VariableManager get_vars() 11579 1726882172.88408: Calling all_inventory to load vars for managed_node1 11579 1726882172.88410: Calling groups_inventory to load vars for managed_node1 11579 1726882172.88413: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882172.88417: Calling all_plugins_play to load vars for managed_node1 11579 1726882172.88597: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882172.88602: Calling groups_plugins_play to load vars for managed_node1 11579 1726882172.89112: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.89705: done with get_vars() 11579 1726882172.89712: done getting variables

TASK [Include the task 'el_repo_setup.yml'] ************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:11
Friday 20 September 2024 21:29:32 -0400 (0:00:01.598) 0:00:01.606 ******

11579 1726882172.89784: entering _queue_task() for managed_node1/include_tasks 11579 1726882172.89786: Creating lock for include_tasks 11579 1726882172.90934: worker is 1 (out of 1 available) 11579 1726882172.90945: exiting _queue_task() for managed_node1/include_tasks 11579 1726882172.90954: done queuing things up, now waiting for results queue to drain 11579 1726882172.90956: waiting for pending results...
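For context, the include being queued here is a single-task wrapper. A hedged reconstruction of what `tests_bond_nm.yml:11` presumably contains, based only on the task name and the included-file path recorded later in this log (the actual file is not part of the transcript):

```yaml
# Hypothetical reconstruction of the include task at tests_bond_nm.yml:11.
- name: Include the task 'el_repo_setup.yml'
  include_tasks: tasks/el_repo_setup.yml
```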
11579 1726882172.91565: running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' 11579 1726882172.91609: in run() - task 12673a56-9f93-f197-7423-000000000006 11579 1726882172.91770: variable 'ansible_search_path' from source: unknown 11579 1726882172.91812: calling self._execute() 11579 1726882172.92151: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882172.92155: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882172.92157: variable 'omit' from source: magic vars 11579 1726882172.92527: _execute() done 11579 1726882172.92531: dumping result to json 11579 1726882172.92533: done dumping result, returning 11579 1726882172.92535: done running TaskExecutor() for managed_node1/TASK: Include the task 'el_repo_setup.yml' [12673a56-9f93-f197-7423-000000000006] 11579 1726882172.92537: sending task result for task 12673a56-9f93-f197-7423-000000000006 11579 1726882172.92814: done sending task result for task 12673a56-9f93-f197-7423-000000000006 11579 1726882172.92817: WORKER PROCESS EXITING 11579 1726882172.92888: no more pending results, returning what we have 11579 1726882172.92895: in VariableManager get_vars() 11579 1726882172.92929: Calling all_inventory to load vars for managed_node1 11579 1726882172.92932: Calling groups_inventory to load vars for managed_node1 11579 1726882172.92936: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882172.92950: Calling all_plugins_play to load vars for managed_node1 11579 1726882172.92953: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882172.92957: Calling groups_plugins_play to load vars for managed_node1 11579 1726882172.93788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.94335: done with get_vars() 11579 1726882172.94343: variable 'ansible_search_path' from source: unknown 11579 1726882172.94357: we have 
included files to process 11579 1726882172.94358: generating all_blocks data 11579 1726882172.94359: done generating all_blocks data 11579 1726882172.94360: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11579 1726882172.94361: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11579 1726882172.94363: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml 11579 1726882172.96452: in VariableManager get_vars() 11579 1726882172.96470: done with get_vars() 11579 1726882172.96483: done processing included file 11579 1726882172.96485: iterating over new_blocks loaded from include file 11579 1726882172.96487: in VariableManager get_vars() 11579 1726882172.96880: done with get_vars() 11579 1726882172.96883: filtering new block on tags 11579 1726882172.96900: done filtering new block on tags 11579 1726882172.96903: in VariableManager get_vars() 11579 1726882172.96915: done with get_vars() 11579 1726882172.96916: filtering new block on tags 11579 1726882172.96933: done filtering new block on tags 11579 1726882172.96935: in VariableManager get_vars() 11579 1726882172.96946: done with get_vars() 11579 1726882172.96948: filtering new block on tags 11579 1726882172.96961: done filtering new block on tags 11579 1726882172.96963: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml for managed_node1 11579 1726882172.96969: extending task lists for all hosts with included blocks 11579 1726882172.97132: done extending task lists 11579 1726882172.97134: done processing included files 11579 1726882172.97135: results queue empty 11579 1726882172.97135: checking for any_errors_fatal 11579 1726882172.97137: done checking for any_errors_fatal 11579 
1726882172.97138: checking for max_fail_percentage 11579 1726882172.97139: done checking for max_fail_percentage 11579 1726882172.97140: checking to see if all hosts have failed and the running result is not ok 11579 1726882172.97140: done checking to see if all hosts have failed 11579 1726882172.97141: getting the remaining hosts for this loop 11579 1726882172.97143: done getting the remaining hosts for this loop 11579 1726882172.97145: getting the next task for host managed_node1 11579 1726882172.97150: done getting next task for host managed_node1 11579 1726882172.97152: ^ task is: TASK: Gather the minimum subset of ansible_facts required by the network role test 11579 1726882172.97154: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882172.97156: getting variables 11579 1726882172.97157: in VariableManager get_vars() 11579 1726882172.97164: Calling all_inventory to load vars for managed_node1 11579 1726882172.97166: Calling groups_inventory to load vars for managed_node1 11579 1726882172.97168: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882172.97173: Calling all_plugins_play to load vars for managed_node1 11579 1726882172.97175: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882172.97178: Calling groups_plugins_play to load vars for managed_node1 11579 1726882172.97642: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882172.98274: done with get_vars() 11579 1726882172.98285: done getting variables

TASK [Gather the minimum subset of ansible_facts required by the network role test] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:3
Friday 20 September 2024 21:29:32 -0400 (0:00:00.087) 0:00:01.693 ******

11579 1726882172.98531: entering _queue_task() for managed_node1/setup 11579 1726882172.99638: worker is 1 (out of 1 available) 11579 1726882172.99648: exiting _queue_task() for managed_node1/setup 11579 1726882172.99773: done queuing things up, now waiting for results queue to drain 11579 1726882172.99775: waiting for pending results...
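The task entering the queue here typically wraps the `setup` module with a restricted fact subset and a guard so it runs only when required facts are missing. A sketch under that assumption (`gather_subset: min` and the variable name are assumptions; the `when` expression is quoted from the `false_condition` recorded further down in this log):

```yaml
# Hypothetical sketch of the conditional fact-gathering task;
# gather_subset value is an assumption based on the task name.
- name: Gather the minimum subset of ansible_facts required by the network role test
  setup:
    gather_subset: min
  when: not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts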
11579 1726882173.00513: running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test 11579 1726882173.00518: in run() - task 12673a56-9f93-f197-7423-0000000000dd 11579 1726882173.00521: variable 'ansible_search_path' from source: unknown 11579 1726882173.00523: variable 'ansible_search_path' from source: unknown 11579 1726882173.00526: calling self._execute() 11579 1726882173.00901: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882173.00905: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882173.00908: variable 'omit' from source: magic vars 11579 1726882173.01662: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882173.06007: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882173.06046: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882173.06087: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882173.06142: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882173.06228: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882173.06477: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882173.06513: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882173.06542: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882173.06587: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882173.06801: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882173.06986: variable 'ansible_facts' from source: unknown 11579 1726882173.07169: variable 'network_test_required_facts' from source: task vars 11579 1726882173.07601: Evaluated conditional (not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts): False 11579 1726882173.07605: when evaluation is False, skipping this task 11579 1726882173.07607: _execute() done 11579 1726882173.07609: dumping result to json 11579 1726882173.07612: done dumping result, returning 11579 1726882173.07614: done running TaskExecutor() for managed_node1/TASK: Gather the minimum subset of ansible_facts required by the network role test [12673a56-9f93-f197-7423-0000000000dd] 11579 1726882173.07616: sending task result for task 12673a56-9f93-f197-7423-0000000000dd 11579 1726882173.07691: done sending task result for task 12673a56-9f93-f197-7423-0000000000dd 11579 1726882173.07700: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not ansible_facts.keys() | list | intersect(network_test_required_facts) == network_test_required_facts", "skip_reason": "Conditional result was False" } 11579 1726882173.07773: no more pending results, returning what we have 11579 1726882173.07777: results queue empty 11579 1726882173.07783: checking for any_errors_fatal 11579 1726882173.07784: 
done checking for any_errors_fatal 11579 1726882173.07785: checking for max_fail_percentage 11579 1726882173.07786: done checking for max_fail_percentage 11579 1726882173.07787: checking to see if all hosts have failed and the running result is not ok 11579 1726882173.07789: done checking to see if all hosts have failed 11579 1726882173.07789: getting the remaining hosts for this loop 11579 1726882173.07791: done getting the remaining hosts for this loop 11579 1726882173.07796: getting the next task for host managed_node1 11579 1726882173.07807: done getting next task for host managed_node1 11579 1726882173.07809: ^ task is: TASK: Check if system is ostree 11579 1726882173.07812: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882173.07815: getting variables 11579 1726882173.07817: in VariableManager get_vars() 11579 1726882173.07845: Calling all_inventory to load vars for managed_node1 11579 1726882173.07848: Calling groups_inventory to load vars for managed_node1 11579 1726882173.07851: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882173.07862: Calling all_plugins_play to load vars for managed_node1 11579 1726882173.07865: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882173.07868: Calling groups_plugins_play to load vars for managed_node1 11579 1726882173.08344: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882173.08880: done with get_vars() 11579 1726882173.08890: done getting variables TASK [Check if system is ostree] *********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17 Friday 20 September 2024 21:29:33 -0400 (0:00:00.104) 0:00:01.798 ****** 11579 1726882173.09028: entering _queue_task() for managed_node1/stat 11579 1726882173.09671: worker is 1 (out of 1 available) 11579 1726882173.09682: exiting _queue_task() for managed_node1/stat 11579 1726882173.09999: done queuing things up, now waiting for results queue to drain 11579 1726882173.10001: waiting for pending results... 
11579 1726882173.10342: running TaskExecutor() for managed_node1/TASK: Check if system is ostree 11579 1726882173.10348: in run() - task 12673a56-9f93-f197-7423-0000000000df 11579 1726882173.10447: variable 'ansible_search_path' from source: unknown 11579 1726882173.10455: variable 'ansible_search_path' from source: unknown 11579 1726882173.10497: calling self._execute() 11579 1726882173.10700: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882173.10705: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882173.10708: variable 'omit' from source: magic vars 11579 1726882173.11681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882173.12152: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882173.12343: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882173.12599: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882173.12604: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882173.12829: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882173.12832: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882173.12835: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882173.12837: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882173.13111: Evaluated conditional (not __network_is_ostree is defined): True 11579 1726882173.13121: variable 'omit' from source: magic vars 11579 1726882173.13373: variable 'omit' from source: magic vars 11579 1726882173.13376: variable 'omit' from source: magic vars 11579 1726882173.13379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882173.13382: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882173.13523: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882173.13546: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882173.13561: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882173.13604: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882173.13799: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882173.13803: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882173.14029: Set connection var ansible_timeout to 10 11579 1726882173.14032: Set connection var ansible_shell_type to sh 11579 1726882173.14035: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882173.14037: Set connection var ansible_shell_executable to /bin/sh 11579 1726882173.14039: Set connection var ansible_pipelining to False 11579 1726882173.14041: Set connection var ansible_connection to ssh 11579 1726882173.14043: variable 'ansible_shell_executable' from source: unknown 11579 1726882173.14045: variable 'ansible_connection' from 
source: unknown 11579 1726882173.14047: variable 'ansible_module_compression' from source: unknown 11579 1726882173.14049: variable 'ansible_shell_type' from source: unknown 11579 1726882173.14051: variable 'ansible_shell_executable' from source: unknown 11579 1726882173.14053: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882173.14055: variable 'ansible_pipelining' from source: unknown 11579 1726882173.14057: variable 'ansible_timeout' from source: unknown 11579 1726882173.14059: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882173.14399: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882173.14403: variable 'omit' from source: magic vars 11579 1726882173.14405: starting attempt loop 11579 1726882173.14407: running the handler 11579 1726882173.14410: _low_level_execute_command(): starting 11579 1726882173.14471: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882173.15895: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882173.15914: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882173.16009: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882173.16129: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882173.16245: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882173.16280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882173.17958: stdout chunk (state=3): >>>/root <<< 11579 1726882173.18103: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882173.18106: stdout chunk (state=3): >>><<< 11579 1726882173.18109: stderr chunk (state=3): >>><<< 11579 1726882173.18284: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882173.18300: _low_level_execute_command(): starting 11579 1726882173.18303: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299 `" && echo ansible-tmp-1726882173.1819508-11663-135177153972299="` echo /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299 `" ) && sleep 0' 11579 1726882173.19159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882173.19266: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882173.19284: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882173.19322: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882173.19363: stderr chunk (state=3): >>>debug1: mux_client_request_session: 
master session id: 4 <<< 11579 1726882173.21828: stdout chunk (state=3): >>>ansible-tmp-1726882173.1819508-11663-135177153972299=/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299 <<< 11579 1726882173.22072: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882173.22076: stdout chunk (state=3): >>><<< 11579 1726882173.22079: stderr chunk (state=3): >>><<< 11579 1726882173.22120: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882173.1819508-11663-135177153972299=/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11579 1726882173.22160: variable 'ansible_module_compression' from source: unknown 11579 1726882173.22347: ANSIBALLZ: Using lock for stat 11579 1726882173.22350: ANSIBALLZ: Acquiring lock 11579 1726882173.22352: 
ANSIBALLZ: Lock acquired: 139873763449728 11579 1726882173.22353: ANSIBALLZ: Creating module 11579 1726882173.40608: ANSIBALLZ: Writing module into payload 11579 1726882173.40706: ANSIBALLZ: Writing module 11579 1726882173.40728: ANSIBALLZ: Renaming module 11579 1726882173.40733: ANSIBALLZ: Done creating module 11579 1726882173.40755: variable 'ansible_facts' from source: unknown 11579 1726882173.40837: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py 11579 1726882173.41020: Sending initial data 11579 1726882173.41023: Sent initial data (153 bytes) 11579 1726882173.41635: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882173.41645: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882173.41662: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882173.41677: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882173.41708: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882173.41770: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882173.41795: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 
1726882173.41822: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882173.41835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882173.41922: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11579 1726882173.44301: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882173.44340: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp89qor59p /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py <<< 11579 1726882173.44343: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py" <<< 11579 1726882173.44379: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp89qor59p" to remote "/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py" <<< 11579 1726882173.45615: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882173.45677: stderr chunk (state=3): >>><<< 11579 1726882173.45685: stdout chunk (state=3): >>><<< 11579 1726882173.45810: done transferring module to remote 11579 1726882173.45817: _low_level_execute_command(): starting 11579 1726882173.45826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/ /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py && sleep 0' 11579 1726882173.46492: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882173.46509: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882173.46526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882173.46580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882173.46649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882173.46670: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882173.46714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882173.46763: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11579 1726882173.49266: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882173.49270: stdout chunk (state=3): >>><<< 11579 1726882173.49272: stderr chunk (state=3): >>><<< 11579 1726882173.49375: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 11579 1726882173.49379: _low_level_execute_command(): starting 11579 1726882173.49385: _low_level_execute_command(): executing: /bin/sh -c 'PYTHONVERBOSE=1 /usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/AnsiballZ_stat.py && sleep 0' 11579 1726882173.50012: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882173.50089: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882173.50116: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882173.50153: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11579 1726882173.50214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 4 <<< 11579 1726882173.53397: stdout chunk (state=3): >>>import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # <<< 11579 1726882173.53414: stdout chunk (state=3): >>>import '_io' # import 'marshal' # <<< 11579 1726882173.53461: stdout chunk (state=3): >>>import 'posix' # <<< 11579 1726882173.53524: stdout chunk (state=3): >>>import '_frozen_importlib_external' # <<< 11579 1726882173.53550: stdout chunk (state=3): >>># installing zipimport hook <<< 11579 1726882173.53601: stdout chunk (state=3): >>>import 'time' # import 'zipimport' # <<< 11579 1726882173.53604: stdout chunk (state=3): >>> # installed zipimport hook <<< 11579 1726882173.53672: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc'<<< 11579 1726882173.53708: stdout chunk (state=3): >>> import '_codecs' # <<< 11579 1726882173.53746: stdout chunk (state=3): >>>import 'codecs' # <<< 11579 1726882173.53802: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py <<< 11579 1726882173.53841: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' <<< 11579 1726882173.53864: stdout chunk (state=3): >>>import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dc184d0> <<< 11579 1726882173.53915: stdout chunk (state=3): >>>import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dbe7b30><<< 11579 1726882173.53927: stdout chunk (state=3): >>> # /usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc 
matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc'<<< 11579 1726882173.53953: stdout chunk (state=3): >>> import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dc1aa50> <<< 11579 1726882173.53987: stdout chunk (state=3): >>>import '_signal' # <<< 11579 1726882173.54029: stdout chunk (state=3): >>> import '_abc' # <<< 11579 1726882173.54046: stdout chunk (state=3): >>> import 'abc' # <<< 11579 1726882173.54119: stdout chunk (state=3): >>>import 'io' # import '_stat' # <<< 11579 1726882173.54137: stdout chunk (state=3): >>> import 'stat' # <<< 11579 1726882173.54321: stdout chunk (state=3): >>>import '_collections_abc' # import 'genericpath' # <<< 11579 1726882173.54325: stdout chunk (state=3): >>> import 'posixpath' # <<< 11579 1726882173.54359: stdout chunk (state=3): >>>import 'os' # <<< 11579 1726882173.54414: stdout chunk (state=3): >>>import '_sitebuiltins' # Processing user site-packages Processing global site-packages<<< 11579 1726882173.54453: stdout chunk (state=3): >>> Adding directory: '/usr/lib64/python3.12/site-packages'<<< 11579 1726882173.54466: stdout chunk (state=3): >>> Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' <<< 11579 1726882173.54537: stdout chunk (state=3): >>># /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc'<<< 11579 1726882173.54548: stdout chunk (state=3): >>> import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da09130> <<< 11579 1726882173.54624: stdout chunk (state=3): >>># /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches 
/usr/lib/python3.12/site-packages/_distutils_hack/__init__.py<<< 11579 1726882173.54650: stdout chunk (state=3): >>> # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.54699: stdout chunk (state=3): >>>import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da09fa0> import 'site' # <<< 11579 1726882173.54752: stdout chunk (state=3): >>>Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information.<<< 11579 1726882173.54849: stdout chunk (state=3): >>> <<< 11579 1726882173.55120: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py <<< 11579 1726882173.55150: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' <<< 11579 1726882173.55200: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py <<< 11579 1726882173.55230: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.55277: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py <<< 11579 1726882173.55328: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' <<< 11579 1726882173.55614: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da47e60> <<< 11579 1726882173.55617: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches 
/usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da47f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py <<< 11579 1726882173.55661: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.55720: stdout chunk (state=3): >>>import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py<<< 11579 1726882173.55731: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc'<<< 11579 1726882173.55742: stdout chunk (state=3): >>> import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da7f890><<< 11579 1726882173.55774: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py<<< 11579 1726882173.55799: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc'<<< 11579 1726882173.55819: stdout chunk (state=3): >>> import 'reprlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da7ff20><<< 11579 1726882173.55843: stdout chunk (state=3): >>> import '_collections' # <<< 11579 1726882173.55920: stdout chunk (state=3): >>>import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5fb30> import '_functools' # <<< 11579 1726882173.56041: stdout chunk (state=3): >>>import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5d250> <<< 11579 
1726882173.56109: stdout chunk (state=3): >>>import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da45010> <<< 11579 1726882173.56150: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py <<< 11579 1726882173.56183: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' <<< 11579 1726882173.56216: stdout chunk (state=3): >>>import '_sre' # <<< 11579 1726882173.56242: stdout chunk (state=3): >>> <<< 11579 1726882173.56255: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py <<< 11579 1726882173.56301: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' <<< 11579 1726882173.56319: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py<<< 11579 1726882173.56341: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc'<<< 11579 1726882173.56401: stdout chunk (state=3): >>> import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da9f800> <<< 11579 1726882173.56422: stdout chunk (state=3): >>>import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da9e450> <<< 11579 1726882173.56468: stdout chunk (state=3): >>># /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py <<< 11579 1726882173.56471: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5e120><<< 11579 1726882173.56500: stdout chunk (state=3): >>> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 
0x7f5a6da9ccb0><<< 11579 1726882173.56548: stdout chunk (state=3): >>> <<< 11579 1726882173.56597: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' <<< 11579 1726882173.56635: stdout chunk (state=3): >>>import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad4860> <<< 11579 1726882173.56648: stdout chunk (state=3): >>>import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da44290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches /usr/lib64/python3.12/struct.py<<< 11579 1726882173.56707: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so'<<< 11579 1726882173.56718: stdout chunk (state=3): >>> # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.56773: stdout chunk (state=3): >>>import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6dad4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad4bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.56812: stdout chunk (state=3): >>># extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6dad4fb0><<< 11579 1726882173.56815: stdout chunk (state=3): >>> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da42db0><<< 11579 1726882173.56864: stdout chunk (state=3): >>> # 
/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py <<< 11579 1726882173.56879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc'<<< 11579 1726882173.56916: stdout chunk (state=3): >>> # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py<<< 11579 1726882173.56925: stdout chunk (state=3): >>> <<< 11579 1726882173.56965: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' <<< 11579 1726882173.56986: stdout chunk (state=3): >>>import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad5370> <<< 11579 1726882173.57033: stdout chunk (state=3): >>>import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py<<< 11579 1726882173.57043: stdout chunk (state=3): >>> # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' <<< 11579 1726882173.57144: stdout chunk (state=3): >>>import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad65a0> <<< 11579 1726882173.57174: stdout chunk (state=3): >>>import 'importlib.util' # <<< 11579 1726882173.57207: stdout chunk (state=3): >>>import 'runpy' # <<< 11579 1726882173.57209: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches /usr/lib64/python3.12/shutil.py <<< 11579 1726882173.57255: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' <<< 11579 1726882173.57284: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py <<< 11579 1726882173.57308: stdout chunk (state=3): >>># 
code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daec7a0> import 'errno' # <<< 11579 1726882173.57328: stdout chunk (state=3): >>># extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.57397: stdout chunk (state=3): >>># extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daede80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py <<< 11579 1726882173.57434: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' <<< 11579 1726882173.57437: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py <<< 11579 1726882173.57462: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daeed20> <<< 11579 1726882173.57516: stdout chunk (state=3): >>># extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daef320> <<< 11579 1726882173.57561: stdout chunk (state=3): >>>import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daee270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py <<< 11579 1726882173.57564: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' <<< 11579 1726882173.57617: 
stdout chunk (state=3): >>># extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daefda0> <<< 11579 1726882173.57679: stdout chunk (state=3): >>>import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daef4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad6510> <<< 11579 1726882173.57708: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py <<< 11579 1726882173.57754: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' <<< 11579 1726882173.57773: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py <<< 11579 1726882173.57787: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' <<< 11579 1726882173.57825: stdout chunk (state=3): >>># extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d877bf0> <<< 11579 1726882173.57860: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' <<< 11579 1726882173.57889: stdout chunk (state=3): >>># extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from 
'/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a06b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a0410> <<< 11579 1726882173.57952: stdout chunk (state=3): >>># extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a06e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from '/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' <<< 11579 1726882173.58052: stdout chunk (state=3): >>># extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.58232: stdout chunk (state=3): >>># extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a1010> <<< 11579 1726882173.58397: stdout chunk (state=3): >>># extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a19d0> <<< 11579 1726882173.58417: stdout chunk (state=3): >>>import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a08c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d875d90> <<< 11579 1726882173.58435: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc 
matches /usr/lib64/python3.12/weakref.py <<< 11579 1726882173.58499: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py <<< 11579 1726882173.58511: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a2d20> <<< 11579 1726882173.58643: stdout chunk (state=3): >>>import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a0e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad6750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py <<< 11579 1726882173.58673: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.58707: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py <<< 11579 1726882173.58779: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8cf080> <<< 11579 1726882173.58860: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py <<< 11579 1726882173.58901: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.58904: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py <<< 11579 1726882173.58920: stdout chunk (state=3): >>># code object from 
'/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' <<< 11579 1726882173.59000: stdout chunk (state=3): >>>import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8ef440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py <<< 11579 1726882173.59051: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' <<< 11579 1726882173.59139: stdout chunk (state=3): >>>import 'ntpath' # <<< 11579 1726882173.59195: stdout chunk (state=3): >>># /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d950260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py <<< 11579 1726882173.59225: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' <<< 11579 1726882173.59262: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py <<< 11579 1726882173.59336: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' <<< 11579 1726882173.59558: stdout chunk (state=3): >>>import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d9529c0> <<< 11579 1726882173.59569: stdout chunk (state=3): >>>import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d950380> <<< 11579 1726882173.59607: stdout chunk (state=3): >>>import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d91d250> <<< 11579 1726882173.59647: stdout chunk (state=3): >>># /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches 
/usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d755340> <<< 11579 1726882173.59676: stdout chunk (state=3): >>>import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8ee240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a3c50> <<< 11579 1726882173.59820: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/encodings/cp437.pyc' <<< 11579 1726882173.59859: stdout chunk (state=3): >>>import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5a6d8ee840> <<< 11579 1726882173.60025: stdout chunk (state=3): >>># zipimport: found 30 names in '/tmp/ansible_stat_payload_3xbrm89g/ansible_stat_payload.zip' <<< 11579 1726882173.60028: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.60400: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.60404: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py <<< 11579 1726882173.60479: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7aafc0> <<< 11579 1726882173.60491: stdout chunk (state=3): >>>import '_typing' # <<< 11579 1726882173.60778: stdout chunk (state=3): >>>import 'typing' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d789eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d789070> # zipimport: zlib available <<< 11579 1726882173.61245: stdout chunk (state=3): >>>import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available <<< 11579 1726882173.62600: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.64029: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7a8e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py <<< 11579 1726882173.64041: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' <<< 11579 1726882173.64199: stdout chunk (state=3): >>># extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.64202: stdout chunk (state=3): >>>import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d6900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d6690> <<< 
11579 1726882173.64452: stdout chunk (state=3): >>>import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d5fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d63f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7abc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d76b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d78f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py <<< 11579 1726882173.64617: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d7e30> import 'pwd' # <<< 11579 1726882173.64641: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py <<< 11579 1726882173.64672: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' <<< 11579 1726882173.64799: stdout chunk (state=3): >>>import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d111c40> # extension module 'select' loaded 
from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d113860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' <<< 11579 1726882173.64827: stdout chunk (state=3): >>>import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1141d0> <<< 11579 1726882173.64842: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py <<< 11579 1726882173.64879: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' <<< 11579 1726882173.65122: stdout chunk (state=3): >>>import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d115370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d117e30> # extension module '_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d78b0b0> <<< 11579 1726882173.65214: stdout chunk (state=3): >>>import 'subprocess' # <_frozen_importlib_external.SourceFileLoader 
object at 0x7f5a6d1160f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py <<< 11579 1726882173.65246: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' <<< 11579 1726882173.65275: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' <<< 11579 1726882173.65357: stdout chunk (state=3): >>>import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11fda0> import '_tokenize' # <<< 11579 1726882173.65387: stdout chunk (state=3): >>>import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11e870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11e5d0> <<< 11579 1726882173.65418: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' <<< 11579 1726882173.65561: stdout chunk (state=3): >>>import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11eb40> <<< 11579 1726882173.65571: stdout chunk (state=3): >>>import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d116600> <<< 11579 1726882173.65603: stdout chunk (state=3): >>># extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' 
executed from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1679e0> <<< 11579 1726882173.65637: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d168170> <<< 11579 1726882173.65736: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' <<< 11579 1726882173.65739: stdout chunk (state=3): >>># extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d169bb0> <<< 11579 1726882173.65765: stdout chunk (state=3): >>>import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d169970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py <<< 11579 1726882173.65912: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' <<< 11579 1726882173.65947: stdout chunk (state=3): >>># extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # 
extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d16c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16a2a0> <<< 11579 1726882173.65994: stdout chunk (state=3): >>># /usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py <<< 11579 1726882173.66055: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.66058: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # <<< 11579 1726882173.66124: stdout chunk (state=3): >>>import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16f860> <<< 11579 1726882173.66367: stdout chunk (state=3): >>>import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d170380> <<< 11579 1726882173.66471: stdout chunk (state=3): >>># extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1709e0> # extension module 'systemd.id128' loaded 
from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d170ad0> <<< 11579 1726882173.66489: stdout chunk (state=3): >>>import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' <<< 11579 1726882173.66780: stdout chunk (state=3): >>># extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1fc0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d172870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d173c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1724b0> # zipimport: zlib available <<< 11579 1726882173.66783: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.compat' # <<< 11579 1726882173.66785: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.66874: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.66961: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.66985: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available <<< 11579 1726882173.67022: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available <<< 11579 1726882173.67146: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.67270: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.68024: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.68872: stdout chunk (state=3): >>>import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' # <<< 11579 1726882173.68904: stdout chunk (state=3): >>> import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # <<< 11579 1726882173.68957: stdout chunk (state=3): >>> # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py <<< 11579 1726882173.68975: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc'<<< 11579 1726882173.69050: stdout chunk (state=3): >>> # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' <<< 11579 1726882173.69076: stdout chunk (state=3): >>># extension module '_ctypes' 
executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d001430> <<< 11579 1726882173.69206: stdout chunk (state=3): >>># /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py<<< 11579 1726882173.69209: stdout chunk (state=3): >>> <<< 11579 1726882173.69238: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc'<<< 11579 1726882173.69263: stdout chunk (state=3): >>> import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d002210> <<< 11579 1726882173.69300: stdout chunk (state=3): >>>import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1fd4f0><<< 11579 1726882173.69364: stdout chunk (state=3): >>> import 'ansible.module_utils.compat.selinux' # <<< 11579 1726882173.69368: stdout chunk (state=3): >>># zipimport: zlib available<<< 11579 1726882173.69373: stdout chunk (state=3): >>> <<< 11579 1726882173.69403: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.69427: stdout chunk (state=3): >>>import 'ansible.module_utils._text' # <<< 11579 1726882173.69450: stdout chunk (state=3): >>> # zipimport: zlib available <<< 11579 1726882173.69844: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.69922: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py <<< 11579 1726882173.69954: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' <<< 11579 1726882173.69977: stdout chunk (state=3): >>>import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d002240> # zipimport: zlib available<<< 11579 1726882173.70002: stdout chunk (state=3): >>> <<< 11579 1726882173.70618: stdout chunk (state=3): >>># 
zipimport: zlib available <<< 11579 1726882173.71018: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71038: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71123: stdout chunk (state=3): >>>import 'ansible.module_utils.common.collections' # # zipimport: zlib available <<< 11579 1726882173.71164: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71201: stdout chunk (state=3): >>>import 'ansible.module_utils.common.warnings' # <<< 11579 1726882173.71220: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71342: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71371: stdout chunk (state=3): >>>import 'ansible.module_utils.errors' # <<< 11579 1726882173.71406: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available <<< 11579 1726882173.71458: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71491: stdout chunk (state=3): >>>import 'ansible.module_utils.parsing.convert_bool' # <<< 11579 1726882173.71506: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71723: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.71973: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py <<< 11579 1726882173.72019: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' <<< 11579 1726882173.72039: stdout chunk (state=3): >>>import '_ast' # <<< 11579 1726882173.72087: stdout chunk (state=3): >>>import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d003440> <<< 11579 1726882173.72124: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.72360: stdout chunk (state=3): >>># zipimport: zlib available import 
'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882173.72425: stdout chunk (state=3): >>>import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available <<< 11579 1726882173.72472: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.72526: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.72607: stdout chunk (state=3): >>># /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py <<< 11579 1726882173.72627: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' <<< 11579 1726882173.72711: stdout chunk (state=3): >>># extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d00df70> <<< 11579 1726882173.72747: stdout chunk (state=3): >>>import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d00bd70> <<< 11579 1726882173.72790: stdout chunk (state=3): >>>import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available <<< 11579 1726882173.73028: stdout chunk (state=3): >>># zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' <<< 11579 1726882173.73054: stdout chunk (state=3): >>># /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py <<< 11579 1726882173.73258: stdout chunk (state=3): >>># code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8269c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d81a690> <<< 11579 1726882173.73321: stdout chunk (state=3): >>>import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d00e0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d004290> # destroy ansible.module_utils.distro <<< 11579 1726882173.73343: stdout chunk (state=3): >>>import 'ansible.module_utils.distro' # # zipimport: zlib available <<< 11579 1726882173.73500: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.73504: stdout chunk (state=3): >>>import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # <<< 11579 1726882173.73506: stdout chunk (state=3): >>>import 'ansible.module_utils.basic' # <<< 11579 1726882173.73509: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.73511: stdout chunk (state=3): >>># zipimport: zlib available import 'ansible.modules' # <<< 11579 1726882173.73590: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.73631: 
stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.73929: stdout chunk (state=3): >>># zipimport: zlib available <<< 11579 1726882173.73943: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ <<< 11579 1726882173.74261: stdout chunk (state=3): >>># clear sys.path_importer_cache <<< 11579 1726882173.74287: stdout chunk (state=3): >>># clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat <<< 11579 1726882173.74312: stdout chunk (state=3): >>># cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # 
destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing importlib._bootstrap_external # cleanup[2] removing warnings <<< 11579 1726882173.74344: stdout chunk (state=3): >>># cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # 
cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing <<< 11579 1726882173.74382: stdout chunk (state=3): >>># destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform # cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging <<< 11579 1726882173.74604: stdout chunk (state=3): >>># cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] 
removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec 
# cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules <<< 11579 1726882173.74652: stdout chunk (state=3): >>># destroy _sitebuiltins # destroy importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 <<< 11579 1726882173.74730: stdout chunk (state=3): >>># destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path <<< 11579 1726882173.74773: stdout chunk (state=3): >>># destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd <<< 11579 1726882173.74838: stdout chunk (state=3): >>># destroy locale # destroy signal # 
destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno <<< 11579 1726882173.74967: stdout chunk (state=3): >>># destroy array # destroy datetime # destroy selinux # destroy shutil <<< 11579 1726882173.74998: stdout chunk (state=3): >>># destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] 
wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix <<< 11579 1726882173.75051: stdout chunk (state=3): >>># cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime <<< 11579 1726882173.75336: stdout chunk (state=3): >>># destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib <<< 11579 1726882173.75406: stdout chunk (state=3): >>># destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy 
encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time <<< 11579 1726882173.75490: stdout chunk (state=3): >>># destroy _random # destroy _weakref # destroy _hashlib <<< 11579 1726882173.75496: stdout chunk (state=3): >>># destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread <<< 11579 1726882173.75541: stdout chunk (state=3): >>># clear sys.audit hooks <<< 11579 1726882173.76079: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882173.76082: stderr chunk (state=3): >>><<< 11579 1726882173.76085: stdout chunk (state=3): >>><<< 11579 1726882173.76511: _low_level_execute_command() done: rc=0, stdout=import _frozen_importlib # frozen import _imp # builtin import '_thread' # import '_warnings' # import '_weakref' # import '_io' # import 'marshal' # import 'posix' # import '_frozen_importlib_external' # # installing zipimport hook import 'time' # import 'zipimport' # # installed zipimport hook # /usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/encodings/__init__.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/__init__.cpython-312.pyc' import '_codecs' # import 'codecs' # # /usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc matches /usr/lib64/python3.12/encodings/aliases.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/aliases.cpython-312.pyc' import 'encodings.aliases' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dc184d0> import 'encodings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dbe7b30> # 
/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8.cpython-312.pyc' import 'encodings.utf_8' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dc1aa50> import '_signal' # import '_abc' # import 'abc' # import 'io' # import '_stat' # import 'stat' # import '_collections_abc' # import 'genericpath' # import 'posixpath' # import 'os' # import '_sitebuiltins' # Processing user site-packages Processing global site-packages Adding directory: '/usr/lib64/python3.12/site-packages' Adding directory: '/usr/lib/python3.12/site-packages' Processing .pth file: '/usr/lib/python3.12/site-packages/distutils-precedence.pth' # /usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc matches /usr/lib64/python3.12/encodings/utf_8_sig.py # code object from '/usr/lib64/python3.12/encodings/__pycache__/utf_8_sig.cpython-312.pyc' import 'encodings.utf_8_sig' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da09130> # /usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/_distutils_hack/__init__.py # code object from '/usr/lib/python3.12/site-packages/_distutils_hack/__pycache__/__init__.cpython-312.pyc' import '_distutils_hack' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da09fa0> import 'site' # Python 3.12.5 (main, Aug 23 2024, 00:00:00) [GCC 14.2.1 20240801 (Red Hat 14.2.1-1)] on linux Type "help", "copyright", "credits" or "license" for more information. 
# /usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc matches /usr/lib64/python3.12/base64.py # code object from '/usr/lib64/python3.12/__pycache__/base64.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/re/__init__.py # code object from '/usr/lib64/python3.12/re/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc matches /usr/lib64/python3.12/enum.py # code object from '/usr/lib64/python3.12/__pycache__/enum.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/types.cpython-312.pyc matches /usr/lib64/python3.12/types.py # code object from '/usr/lib64/python3.12/__pycache__/types.cpython-312.pyc' import 'types' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da47e60> # /usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc matches /usr/lib64/python3.12/operator.py # code object from '/usr/lib64/python3.12/__pycache__/operator.cpython-312.pyc' import '_operator' # import 'operator' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da47f20> # /usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc matches /usr/lib64/python3.12/functools.py # code object from '/usr/lib64/python3.12/__pycache__/functools.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/collections/__init__.py # code object from '/usr/lib64/python3.12/collections/__pycache__/__init__.cpython-312.pyc' import 'itertools' # # /usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc matches /usr/lib64/python3.12/keyword.py # code object from '/usr/lib64/python3.12/__pycache__/keyword.cpython-312.pyc' import 'keyword' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da7f890> # /usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc matches /usr/lib64/python3.12/reprlib.py # code object from '/usr/lib64/python3.12/__pycache__/reprlib.cpython-312.pyc' import 'reprlib' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da7ff20> import '_collections' # import 'collections' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5fb30> import '_functools' # import 'functools' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5d250> import 'enum' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da45010> # /usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc matches /usr/lib64/python3.12/re/_compiler.py # code object from '/usr/lib64/python3.12/re/__pycache__/_compiler.cpython-312.pyc' import '_sre' # # /usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc matches /usr/lib64/python3.12/re/_parser.py # code object from '/usr/lib64/python3.12/re/__pycache__/_parser.cpython-312.pyc' # /usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc matches /usr/lib64/python3.12/re/_constants.py # code object from '/usr/lib64/python3.12/re/__pycache__/_constants.cpython-312.pyc' import 're._constants' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da9f800> import 're._parser' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da9e450> # /usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc matches /usr/lib64/python3.12/re/_casefix.py # code object from '/usr/lib64/python3.12/re/__pycache__/_casefix.cpython-312.pyc' import 're._casefix' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da5e120> import 're._compiler' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da9ccb0> # /usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc matches /usr/lib64/python3.12/copyreg.py # code object from '/usr/lib64/python3.12/__pycache__/copyreg.cpython-312.pyc' import 'copyreg' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad4860> import 're' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da44290> # /usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc matches 
/usr/lib64/python3.12/struct.py # code object from '/usr/lib64/python3.12/__pycache__/struct.cpython-312.pyc' # extension module '_struct' loaded from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' # extension module '_struct' executed from '/usr/lib64/python3.12/lib-dynload/_struct.cpython-312-x86_64-linux-gnu.so' import '_struct' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6dad4d10> import 'struct' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad4bc0> # extension module 'binascii' loaded from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' # extension module 'binascii' executed from '/usr/lib64/python3.12/lib-dynload/binascii.cpython-312-x86_64-linux-gnu.so' import 'binascii' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6dad4fb0> import 'base64' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6da42db0> # /usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/importlib/__init__.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc matches /usr/lib64/python3.12/warnings.py # code object from '/usr/lib64/python3.12/__pycache__/warnings.cpython-312.pyc' import 'warnings' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad56a0> import 'importlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad5370> import 'importlib.machinery' # # /usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc matches /usr/lib64/python3.12/importlib/_abc.py # code object from '/usr/lib64/python3.12/importlib/__pycache__/_abc.cpython-312.pyc' import 'importlib._abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad65a0> import 'importlib.util' # import 'runpy' # # /usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc matches 
/usr/lib64/python3.12/shutil.py # code object from '/usr/lib64/python3.12/__pycache__/shutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc matches /usr/lib64/python3.12/fnmatch.py # code object from '/usr/lib64/python3.12/__pycache__/fnmatch.cpython-312.pyc' import 'fnmatch' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daec7a0> import 'errno' # # extension module 'zlib' loaded from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' # extension module 'zlib' executed from '/usr/lib64/python3.12/lib-dynload/zlib.cpython-312-x86_64-linux-gnu.so' import 'zlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daede80> # /usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc matches /usr/lib64/python3.12/bz2.py # code object from '/usr/lib64/python3.12/__pycache__/bz2.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc matches /usr/lib64/python3.12/_compression.py # code object from '/usr/lib64/python3.12/__pycache__/_compression.cpython-312.pyc' import '_compression' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daeed20> # extension module '_bz2' loaded from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' # extension module '_bz2' executed from '/usr/lib64/python3.12/lib-dynload/_bz2.cpython-312-x86_64-linux-gnu.so' import '_bz2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daef320> import 'bz2' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daee270> # /usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc matches /usr/lib64/python3.12/lzma.py # code object from '/usr/lib64/python3.12/__pycache__/lzma.cpython-312.pyc' # extension module '_lzma' loaded from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' # extension module '_lzma' executed from '/usr/lib64/python3.12/lib-dynload/_lzma.cpython-312-x86_64-linux-gnu.so' import '_lzma' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6daefda0> import 'lzma' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6daef4d0> import 'shutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad6510> # /usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc matches /usr/lib64/python3.12/tempfile.py # code object from '/usr/lib64/python3.12/__pycache__/tempfile.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/random.cpython-312.pyc matches /usr/lib64/python3.12/random.py # code object from '/usr/lib64/python3.12/__pycache__/random.cpython-312.pyc' # extension module 'math' loaded from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' # extension module 'math' executed from '/usr/lib64/python3.12/lib-dynload/math.cpython-312-x86_64-linux-gnu.so' import 'math' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d877bf0> # /usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc matches /usr/lib64/python3.12/bisect.py # code object from '/usr/lib64/python3.12/__pycache__/bisect.cpython-312.pyc' # extension module '_bisect' loaded from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' # extension module '_bisect' executed from '/usr/lib64/python3.12/lib-dynload/_bisect.cpython-312-x86_64-linux-gnu.so' import '_bisect' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a06b0> import 'bisect' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a0410> # extension module '_random' loaded from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' # extension module '_random' executed from '/usr/lib64/python3.12/lib-dynload/_random.cpython-312-x86_64-linux-gnu.so' import '_random' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a06e0> # /usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc matches /usr/lib64/python3.12/hashlib.py # code object from 
'/usr/lib64/python3.12/__pycache__/hashlib.cpython-312.pyc' # extension module '_hashlib' loaded from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' # extension module '_hashlib' executed from '/usr/lib64/python3.12/lib-dynload/_hashlib.cpython-312-x86_64-linux-gnu.so' import '_hashlib' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a1010> # extension module '_blake2' loaded from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' # extension module '_blake2' executed from '/usr/lib64/python3.12/lib-dynload/_blake2.cpython-312-x86_64-linux-gnu.so' import '_blake2' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d8a19d0> import 'hashlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a08c0> import 'random' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d875d90> # /usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc matches /usr/lib64/python3.12/weakref.py # code object from '/usr/lib64/python3.12/__pycache__/weakref.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc matches /usr/lib64/python3.12/_weakrefset.py # code object from '/usr/lib64/python3.12/__pycache__/_weakrefset.cpython-312.pyc' import '_weakrefset' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a2d20> import 'weakref' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a0e60> import 'tempfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6dad6750> # /usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/__init__.py # code object from '/usr/lib64/python3.12/zipfile/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc matches /usr/lib64/python3.12/threading.py # code object from '/usr/lib64/python3.12/__pycache__/threading.cpython-312.pyc' import 'threading' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8cf080> # /usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/__init__.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc matches /usr/lib64/python3.12/contextlib.py # code object from '/usr/lib64/python3.12/__pycache__/contextlib.cpython-312.pyc' import 'contextlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8ef440> # /usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc matches /usr/lib64/python3.12/pathlib.py # code object from '/usr/lib64/python3.12/__pycache__/pathlib.cpython-312.pyc' import 'ntpath' # # /usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/urllib/__init__.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/__init__.cpython-312.pyc' import 'urllib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d950260> # /usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc matches /usr/lib64/python3.12/urllib/parse.py # code object from '/usr/lib64/python3.12/urllib/__pycache__/parse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc matches /usr/lib64/python3.12/ipaddress.py # code object from '/usr/lib64/python3.12/__pycache__/ipaddress.cpython-312.pyc' import 'ipaddress' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d9529c0> import 'urllib.parse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d950380> import 'pathlib' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d91d250> # /usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc matches /usr/lib64/python3.12/zipfile/_path/glob.py # code object from '/usr/lib64/python3.12/zipfile/_path/__pycache__/glob.cpython-312.pyc' import 'zipfile._path.glob' # 
<_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d755340> import 'zipfile._path' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8ee240> import 'zipfile' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8a3c50> # code object from '/usr/lib64/python3.12/encodings/cp437.pyc' import 'encodings.cp437' # <_frozen_importlib_external.SourcelessFileLoader object at 0x7f5a6d8ee840> # zipimport: found 30 names in '/tmp/ansible_stat_payload_3xbrm89g/ansible_stat_payload.zip' # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc matches /usr/lib64/python3.12/pkgutil.py # code object from '/usr/lib64/python3.12/__pycache__/pkgutil.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc matches /usr/lib64/python3.12/typing.py # code object from '/usr/lib64/python3.12/__pycache__/typing.cpython-312.pyc' # /usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc matches /usr/lib64/python3.12/collections/abc.py # code object from '/usr/lib64/python3.12/collections/__pycache__/abc.cpython-312.pyc' import 'collections.abc' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7aafc0> import '_typing' # import 'typing' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d789eb0> import 'pkgutil' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d789070> # zipimport: zlib available import 'ansible' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc matches /usr/lib64/python3.12/__future__.py # code object from '/usr/lib64/python3.12/__pycache__/__future__.cpython-312.pyc' import '__future__' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7a8e90> # /usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc 
matches /usr/lib64/python3.12/json/__init__.py # code object from '/usr/lib64/python3.12/json/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc matches /usr/lib64/python3.12/json/decoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/decoder.cpython-312.pyc' # /usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc matches /usr/lib64/python3.12/json/scanner.py # code object from '/usr/lib64/python3.12/json/__pycache__/scanner.cpython-312.pyc' # extension module '_json' loaded from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' # extension module '_json' executed from '/usr/lib64/python3.12/lib-dynload/_json.cpython-312-x86_64-linux-gnu.so' import '_json' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d6900> import 'json.scanner' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d6690> import 'json.decoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d5fa0> # /usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc matches /usr/lib64/python3.12/json/encoder.py # code object from '/usr/lib64/python3.12/json/__pycache__/encoder.cpython-312.pyc' import 'json.encoder' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d63f0> import 'json' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7abc50> import 'atexit' # # extension module 'grp' loaded from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' # extension module 'grp' executed from '/usr/lib64/python3.12/lib-dynload/grp.cpython-312-x86_64-linux-gnu.so' import 'grp' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d76b0> # extension module 'fcntl' loaded from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' # extension module 'fcntl' executed from '/usr/lib64/python3.12/lib-dynload/fcntl.cpython-312-x86_64-linux-gnu.so' import 'fcntl' # 
<_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d7d78f0> # /usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc matches /usr/lib64/python3.12/locale.py # code object from '/usr/lib64/python3.12/__pycache__/locale.cpython-312.pyc' import '_locale' # import 'locale' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d7d7e30> import 'pwd' # # /usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc matches /usr/lib64/python3.12/platform.py # code object from '/usr/lib64/python3.12/__pycache__/platform.cpython-312.pyc' import 'platform' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d111c40> # extension module 'select' loaded from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' # extension module 'select' executed from '/usr/lib64/python3.12/lib-dynload/select.cpython-312-x86_64-linux-gnu.so' import 'select' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d113860> # /usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc matches /usr/lib64/python3.12/selectors.py # code object from '/usr/lib64/python3.12/__pycache__/selectors.cpython-312.pyc' import 'selectors' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1141d0> # /usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc matches /usr/lib64/python3.12/shlex.py # code object from '/usr/lib64/python3.12/__pycache__/shlex.cpython-312.pyc' import 'shlex' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d115370> # /usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc matches /usr/lib64/python3.12/subprocess.py # code object from '/usr/lib64/python3.12/__pycache__/subprocess.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc matches /usr/lib64/python3.12/signal.py # code object from '/usr/lib64/python3.12/__pycache__/signal.cpython-312.pyc' import 'signal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d117e30> # extension module 
'_posixsubprocess' loaded from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' # extension module '_posixsubprocess' executed from '/usr/lib64/python3.12/lib-dynload/_posixsubprocess.cpython-312-x86_64-linux-gnu.so' import '_posixsubprocess' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d78b0b0> import 'subprocess' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1160f0> # /usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc matches /usr/lib64/python3.12/traceback.py # code object from '/usr/lib64/python3.12/__pycache__/traceback.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc matches /usr/lib64/python3.12/linecache.py # code object from '/usr/lib64/python3.12/__pycache__/linecache.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc matches /usr/lib64/python3.12/tokenize.py # code object from '/usr/lib64/python3.12/__pycache__/tokenize.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/token.cpython-312.pyc matches /usr/lib64/python3.12/token.py # code object from '/usr/lib64/python3.12/__pycache__/token.cpython-312.pyc' import 'token' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11fda0> import '_tokenize' # import 'tokenize' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11e870> import 'linecache' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11e5d0> # /usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc matches /usr/lib64/python3.12/textwrap.py # code object from '/usr/lib64/python3.12/__pycache__/textwrap.cpython-312.pyc' import 'textwrap' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d11eb40> import 'traceback' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d116600> # extension module 'syslog' loaded from '/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' # extension module 'syslog' executed from 
'/usr/lib64/python3.12/lib-dynload/syslog.cpython-312-x86_64-linux-gnu.so' import 'syslog' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1679e0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/__init__.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/__init__.cpython-312.pyc' import 'systemd' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d168170> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/journal.py # code object from '/usr/lib64/python3.12/site-packages/systemd/__pycache__/journal.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc matches /usr/lib64/python3.12/datetime.py # code object from '/usr/lib64/python3.12/__pycache__/datetime.cpython-312.pyc' # extension module '_datetime' loaded from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' # extension module '_datetime' executed from '/usr/lib64/python3.12/lib-dynload/_datetime.cpython-312-x86_64-linux-gnu.so' import '_datetime' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d169bb0> import 'datetime' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d169970> # /usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc matches /usr/lib64/python3.12/uuid.py # code object from '/usr/lib64/python3.12/__pycache__/uuid.cpython-312.pyc' # extension module '_uuid' loaded from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' # extension module '_uuid' executed from '/usr/lib64/python3.12/lib-dynload/_uuid.cpython-312-x86_64-linux-gnu.so' import '_uuid' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d16c110> import 'uuid' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16a2a0> # 
/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/logging/__init__.py # code object from '/usr/lib64/python3.12/logging/__pycache__/__init__.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/string.cpython-312.pyc matches /usr/lib64/python3.12/string.py # code object from '/usr/lib64/python3.12/__pycache__/string.cpython-312.pyc' import '_string' # import 'string' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16f860> import 'logging' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d16c230> # extension module 'systemd._journal' loaded from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._journal' executed from '/usr/lib64/python3.12/site-packages/systemd/_journal.cpython-312-x86_64-linux-gnu.so' import 'systemd._journal' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d170380> # extension module 'systemd._reader' loaded from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._reader' executed from '/usr/lib64/python3.12/site-packages/systemd/_reader.cpython-312-x86_64-linux-gnu.so' import 'systemd._reader' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1709e0> # extension module 'systemd.id128' loaded from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd.id128' executed from '/usr/lib64/python3.12/site-packages/systemd/id128.cpython-312-x86_64-linux-gnu.so' import 'systemd.id128' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d170ad0> import 'systemd.journal' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1682c0> # /usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/systemd/daemon.py # code object from 
'/usr/lib64/python3.12/site-packages/systemd/__pycache__/daemon.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc matches /usr/lib64/python3.12/socket.py # code object from '/usr/lib64/python3.12/__pycache__/socket.cpython-312.pyc' # extension module '_socket' loaded from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' # extension module '_socket' executed from '/usr/lib64/python3.12/lib-dynload/_socket.cpython-312-x86_64-linux-gnu.so' import '_socket' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1fc0e0> # extension module 'array' loaded from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' # extension module 'array' executed from '/usr/lib64/python3.12/lib-dynload/array.cpython-312-x86_64-linux-gnu.so' import 'array' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d1fd250> import 'socket' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d172870> # extension module 'systemd._daemon' loaded from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' # extension module 'systemd._daemon' executed from '/usr/lib64/python3.12/site-packages/systemd/_daemon.cpython-312-x86_64-linux-gnu.so' import 'systemd._daemon' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d173c20> import 'systemd.daemon' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1724b0> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.compat' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.six' # import 'ansible.module_utils.six.moves' 
# import 'ansible.module_utils.six.moves.collections_abc' # import 'ansible.module_utils.common.text.converters' # # /usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/__init__.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/__init__.cpython-312.pyc' # extension module '_ctypes' loaded from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' # extension module '_ctypes' executed from '/usr/lib64/python3.12/lib-dynload/_ctypes.cpython-312-x86_64-linux-gnu.so' import '_ctypes' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d001430> # /usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc matches /usr/lib64/python3.12/ctypes/_endian.py # code object from '/usr/lib64/python3.12/ctypes/__pycache__/_endian.cpython-312.pyc' import 'ctypes._endian' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d002210> import 'ctypes' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d1fd4f0> import 'ansible.module_utils.compat.selinux' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils._text' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc matches /usr/lib64/python3.12/copy.py # code object from '/usr/lib64/python3.12/__pycache__/copy.cpython-312.pyc' import 'copy' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d002240> # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.collections' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.warnings' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.errors' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.parsing' # # zipimport: zlib available # zipimport: zlib 
available import 'ansible.module_utils.parsing.convert_bool' # # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc matches /usr/lib64/python3.12/ast.py # code object from '/usr/lib64/python3.12/__pycache__/ast.cpython-312.pyc' import '_ast' # import 'ast' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d003440> # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.text.formatters' # import 'ansible.module_utils.common.validation' # import 'ansible.module_utils.common.parameters' # import 'ansible.module_utils.common.arg_spec' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common.locale' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc matches /usr/lib64/python3.12/site-packages/selinux/__init__.py # code object from '/usr/lib64/python3.12/site-packages/selinux/__pycache__/__init__.cpython-312.pyc' # extension module 'selinux._selinux' loaded from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' # extension module 'selinux._selinux' executed from '/usr/lib64/python3.12/site-packages/selinux/_selinux.cpython-312-x86_64-linux-gnu.so' import 'selinux._selinux' # <_frozen_importlib_external.ExtensionFileLoader object at 0x7f5a6d00df70> import 'selinux' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d00bd70> import 'ansible.module_utils.common.file' # import 'ansible.module_utils.common.process' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available # /usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/__init__.py # code object from 
'/usr/lib/python3.12/site-packages/distro/__pycache__/__init__.cpython-312.pyc' # /usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc matches /usr/lib/python3.12/site-packages/distro/distro.py # code object from '/usr/lib/python3.12/site-packages/distro/__pycache__/distro.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc matches /usr/lib64/python3.12/argparse.py # code object from '/usr/lib64/python3.12/__pycache__/argparse.cpython-312.pyc' # /usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc matches /usr/lib64/python3.12/gettext.py # code object from '/usr/lib64/python3.12/__pycache__/gettext.cpython-312.pyc' import 'gettext' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d8269c0> import 'argparse' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d81a690> import 'distro.distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d00e0f0> import 'distro' # <_frozen_importlib_external.SourceFileLoader object at 0x7f5a6d004290> # destroy ansible.module_utils.distro import 'ansible.module_utils.distro' # # zipimport: zlib available # zipimport: zlib available import 'ansible.module_utils.common._utils' # import 'ansible.module_utils.common.sys_info' # import 'ansible.module_utils.basic' # # zipimport: zlib available # zipimport: zlib available import 'ansible.modules' # # zipimport: zlib available # zipimport: zlib available # zipimport: zlib available {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"path": "/run/ostree-booted", "follow": false, "get_checksum": true, "get_mime": true, "get_attributes": true, "checksum_algorithm": "sha1"}}} # destroy __main__ # clear sys.path_importer_cache # clear sys.path_hooks # clear builtins._ # clear sys.path # clear sys.argv # clear sys.ps1 # clear sys.ps2 # clear sys.last_exc # clear sys.last_type # clear sys.last_value # clear sys.last_traceback # clear sys.__interactivehook__ # clear 
sys.meta_path # restore sys.stdin # restore sys.stdout # restore sys.stderr # cleanup[2] removing sys # cleanup[2] removing builtins # cleanup[2] removing _frozen_importlib # cleanup[2] removing _imp # cleanup[2] removing _thread # cleanup[2] removing _warnings # cleanup[2] removing _weakref # cleanup[2] removing _io # cleanup[2] removing marshal # cleanup[2] removing posix # cleanup[2] removing _frozen_importlib_external # cleanup[2] removing time # cleanup[2] removing zipimport # cleanup[2] removing _codecs # cleanup[2] removing codecs # cleanup[2] removing encodings.aliases # cleanup[2] removing encodings # cleanup[2] removing encodings.utf_8 # cleanup[2] removing _signal # cleanup[2] removing _abc # cleanup[2] removing abc # cleanup[2] removing io # cleanup[2] removing __main__ # cleanup[2] removing _stat # cleanup[2] removing stat # cleanup[2] removing _collections_abc # cleanup[2] removing genericpath # cleanup[2] removing posixpath # cleanup[2] removing os.path # cleanup[2] removing os # cleanup[2] removing _sitebuiltins # cleanup[2] removing encodings.utf_8_sig # cleanup[2] removing _distutils_hack # destroy _distutils_hack # cleanup[2] removing site # destroy site # cleanup[2] removing types # cleanup[2] removing _operator # cleanup[2] removing operator # cleanup[2] removing itertools # cleanup[2] removing keyword # destroy keyword # cleanup[2] removing reprlib # destroy reprlib # cleanup[2] removing _collections # cleanup[2] removing collections # cleanup[2] removing _functools # cleanup[2] removing functools # cleanup[2] removing enum # cleanup[2] removing _sre # cleanup[2] removing re._constants # cleanup[2] removing re._parser # cleanup[2] removing re._casefix # cleanup[2] removing re._compiler # cleanup[2] removing copyreg # cleanup[2] removing re # cleanup[2] removing _struct # cleanup[2] removing struct # cleanup[2] removing binascii # cleanup[2] removing base64 # destroy base64 # cleanup[2] removing importlib._bootstrap # cleanup[2] removing 
importlib._bootstrap_external # cleanup[2] removing warnings # cleanup[2] removing importlib # cleanup[2] removing importlib.machinery # cleanup[2] removing importlib._abc # cleanup[2] removing importlib.util # cleanup[2] removing runpy # destroy runpy # cleanup[2] removing fnmatch # cleanup[2] removing errno # cleanup[2] removing zlib # cleanup[2] removing _compression # cleanup[2] removing _bz2 # cleanup[2] removing bz2 # cleanup[2] removing _lzma # cleanup[2] removing lzma # cleanup[2] removing shutil # cleanup[2] removing math # cleanup[2] removing _bisect # cleanup[2] removing bisect # destroy bisect # cleanup[2] removing _random # cleanup[2] removing _hashlib # cleanup[2] removing _blake2 # cleanup[2] removing hashlib # cleanup[2] removing random # destroy random # cleanup[2] removing _weakrefset # destroy _weakrefset # cleanup[2] removing weakref # cleanup[2] removing tempfile # cleanup[2] removing threading # cleanup[2] removing contextlib # cleanup[2] removing ntpath # cleanup[2] removing urllib # destroy urllib # cleanup[2] removing ipaddress # cleanup[2] removing urllib.parse # destroy urllib.parse # cleanup[2] removing pathlib # cleanup[2] removing zipfile._path.glob # cleanup[2] removing zipfile._path # cleanup[2] removing zipfile # cleanup[2] removing encodings.cp437 # cleanup[2] removing collections.abc # cleanup[2] removing _typing # cleanup[2] removing typing # destroy typing # cleanup[2] removing pkgutil # destroy pkgutil # cleanup[2] removing ansible # destroy ansible # cleanup[2] removing ansible.module_utils # destroy ansible.module_utils # cleanup[2] removing __future__ # destroy __future__ # cleanup[2] removing _json # cleanup[2] removing json.scanner # cleanup[2] removing json.decoder # cleanup[2] removing json.encoder # cleanup[2] removing json # cleanup[2] removing atexit # cleanup[2] removing grp # cleanup[2] removing fcntl # cleanup[2] removing _locale # cleanup[2] removing locale # cleanup[2] removing pwd # cleanup[2] removing platform 
# cleanup[2] removing select # cleanup[2] removing selectors # cleanup[2] removing shlex # cleanup[2] removing signal # cleanup[2] removing _posixsubprocess # cleanup[2] removing subprocess # cleanup[2] removing token # destroy token # cleanup[2] removing _tokenize # cleanup[2] removing tokenize # cleanup[2] removing linecache # cleanup[2] removing textwrap # cleanup[2] removing traceback # cleanup[2] removing syslog # cleanup[2] removing systemd # destroy systemd # cleanup[2] removing _datetime # cleanup[2] removing datetime # cleanup[2] removing _uuid # cleanup[2] removing uuid # cleanup[2] removing _string # cleanup[2] removing string # destroy string # cleanup[2] removing logging # cleanup[2] removing systemd._journal # cleanup[2] removing systemd._reader # cleanup[2] removing systemd.id128 # cleanup[2] removing systemd.journal # cleanup[2] removing _socket # cleanup[2] removing array # cleanup[2] removing socket # destroy socket # cleanup[2] removing systemd._daemon # cleanup[2] removing systemd.daemon # cleanup[2] removing ansible.module_utils.compat # destroy ansible.module_utils.compat # cleanup[2] removing ansible.module_utils.common # destroy ansible.module_utils.common # cleanup[2] removing ansible.module_utils.common.text # destroy ansible.module_utils.common.text # cleanup[2] removing ansible.module_utils.six # destroy ansible.module_utils.six # cleanup[2] removing ansible.module_utils.six.moves # cleanup[2] removing ansible.module_utils.six.moves.collections_abc # cleanup[2] removing ansible.module_utils.common.text.converters # destroy ansible.module_utils.common.text.converters # cleanup[2] removing _ctypes # cleanup[2] removing ctypes._endian # cleanup[2] removing ctypes # destroy ctypes # cleanup[2] removing ansible.module_utils.compat.selinux # cleanup[2] removing ansible.module_utils._text # destroy ansible.module_utils._text # cleanup[2] removing copy # destroy copy # cleanup[2] removing ansible.module_utils.common.collections # destroy 
ansible.module_utils.common.collections # cleanup[2] removing ansible.module_utils.common.warnings # destroy ansible.module_utils.common.warnings # cleanup[2] removing ansible.module_utils.errors # destroy ansible.module_utils.errors # cleanup[2] removing ansible.module_utils.parsing # destroy ansible.module_utils.parsing # cleanup[2] removing ansible.module_utils.parsing.convert_bool # destroy ansible.module_utils.parsing.convert_bool # cleanup[2] removing _ast # destroy _ast # cleanup[2] removing ast # destroy ast # cleanup[2] removing ansible.module_utils.common.text.formatters # destroy ansible.module_utils.common.text.formatters # cleanup[2] removing ansible.module_utils.common.validation # destroy ansible.module_utils.common.validation # cleanup[2] removing ansible.module_utils.common.parameters # destroy ansible.module_utils.common.parameters # cleanup[2] removing ansible.module_utils.common.arg_spec # destroy ansible.module_utils.common.arg_spec # cleanup[2] removing ansible.module_utils.common.locale # destroy ansible.module_utils.common.locale # cleanup[2] removing swig_runtime_data4 # destroy swig_runtime_data4 # cleanup[2] removing selinux._selinux # cleanup[2] removing selinux # cleanup[2] removing ansible.module_utils.common.file # destroy ansible.module_utils.common.file # cleanup[2] removing ansible.module_utils.common.process # destroy ansible.module_utils.common.process # cleanup[2] removing gettext # destroy gettext # cleanup[2] removing argparse # cleanup[2] removing distro.distro # cleanup[2] removing distro # cleanup[2] removing ansible.module_utils.distro # cleanup[2] removing ansible.module_utils.common._utils # destroy ansible.module_utils.common._utils # cleanup[2] removing ansible.module_utils.common.sys_info # destroy ansible.module_utils.common.sys_info # cleanup[2] removing ansible.module_utils.basic # destroy ansible.module_utils.basic # cleanup[2] removing ansible.modules # destroy ansible.modules # destroy _sitebuiltins # destroy 
importlib.machinery # destroy importlib._abc # destroy importlib.util # destroy _bz2 # destroy _compression # destroy _lzma # destroy _blake2 # destroy binascii # destroy struct # destroy zlib # destroy bz2 # destroy lzma # destroy zipfile._path # destroy zipfile # destroy pathlib # destroy zipfile._path.glob # destroy fnmatch # destroy ipaddress # destroy ntpath # destroy importlib # destroy zipimport # destroy __main__ # destroy tempfile # destroy systemd.journal # destroy systemd.daemon # destroy ansible.module_utils.compat.selinux # destroy hashlib # destroy json.decoder # destroy json.encoder # destroy json.scanner # destroy _json # destroy grp # destroy encodings # destroy _locale # destroy pwd # destroy locale # destroy signal # destroy fcntl # destroy select # destroy _signal # destroy _posixsubprocess # destroy syslog # destroy uuid # destroy selectors # destroy errno # destroy array # destroy datetime # destroy selinux # destroy shutil # destroy distro # destroy distro.distro # destroy argparse # destroy json # destroy logging # destroy shlex # destroy subprocess # cleanup[3] wiping selinux._selinux # cleanup[3] wiping ctypes._endian # cleanup[3] wiping _ctypes # cleanup[3] wiping ansible.module_utils.six.moves.collections_abc # cleanup[3] wiping ansible.module_utils.six.moves # cleanup[3] wiping systemd._daemon # cleanup[3] wiping _socket # cleanup[3] wiping systemd.id128 # cleanup[3] wiping systemd._reader # cleanup[3] wiping systemd._journal # cleanup[3] wiping _string # cleanup[3] wiping _uuid # cleanup[3] wiping _datetime # cleanup[3] wiping traceback # destroy linecache # destroy textwrap # cleanup[3] wiping tokenize # cleanup[3] wiping _tokenize # cleanup[3] wiping platform # cleanup[3] wiping atexit # cleanup[3] wiping _typing # cleanup[3] wiping collections.abc # cleanup[3] wiping encodings.cp437 # cleanup[3] wiping contextlib # cleanup[3] wiping threading # cleanup[3] wiping weakref # cleanup[3] wiping _hashlib # cleanup[3] wiping _random # 
cleanup[3] wiping _bisect # cleanup[3] wiping math # cleanup[3] wiping warnings # cleanup[3] wiping importlib._bootstrap_external # cleanup[3] wiping importlib._bootstrap # cleanup[3] wiping _struct # cleanup[3] wiping re # destroy re._constants # destroy re._casefix # destroy re._compiler # destroy enum # cleanup[3] wiping copyreg # cleanup[3] wiping re._parser # cleanup[3] wiping _sre # cleanup[3] wiping functools # cleanup[3] wiping _functools # cleanup[3] wiping collections # destroy _collections_abc # destroy collections.abc # cleanup[3] wiping _collections # cleanup[3] wiping itertools # cleanup[3] wiping operator # cleanup[3] wiping _operator # cleanup[3] wiping types # cleanup[3] wiping encodings.utf_8_sig # cleanup[3] wiping os # destroy posixpath # cleanup[3] wiping genericpath # cleanup[3] wiping stat # cleanup[3] wiping _stat # destroy _stat # cleanup[3] wiping io # destroy abc # cleanup[3] wiping _abc # cleanup[3] wiping encodings.utf_8 # cleanup[3] wiping encodings.aliases # cleanup[3] wiping codecs # cleanup[3] wiping _codecs # cleanup[3] wiping time # cleanup[3] wiping _frozen_importlib_external # cleanup[3] wiping posix # cleanup[3] wiping marshal # cleanup[3] wiping _io # cleanup[3] wiping _weakref # cleanup[3] wiping _warnings # cleanup[3] wiping _thread # cleanup[3] wiping _imp # cleanup[3] wiping _frozen_importlib # cleanup[3] wiping sys # cleanup[3] wiping builtins # destroy selinux._selinux # destroy systemd._daemon # destroy systemd.id128 # destroy systemd._reader # destroy systemd._journal # destroy _datetime # destroy sys.monitoring # destroy _socket # destroy _collections # destroy platform # destroy _uuid # destroy stat # destroy genericpath # destroy re._parser # destroy tokenize # destroy ansible.module_utils.six.moves.urllib # destroy copyreg # destroy contextlib # destroy _typing # destroy _tokenize # destroy ansible.module_utils.six.moves.urllib_parse # destroy ansible.module_utils.six.moves.urllib.error # destroy 
ansible.module_utils.six.moves.urllib.request # destroy ansible.module_utils.six.moves.urllib.response # destroy ansible.module_utils.six.moves.urllib.robotparser # destroy functools # destroy operator # destroy ansible.module_utils.six.moves # destroy _frozen_importlib_external # destroy _imp # destroy _io # destroy marshal # clear sys.meta_path # clear sys.modules # destroy _frozen_importlib # destroy codecs # destroy encodings.aliases # destroy encodings.utf_8 # destroy encodings.utf_8_sig # destroy encodings.cp437 # destroy _codecs # destroy io # destroy traceback # destroy warnings # destroy weakref # destroy collections # destroy threading # destroy atexit # destroy _warnings # destroy math # destroy _bisect # destroy time # destroy _random # destroy _weakref # destroy _hashlib # destroy _operator # destroy _string # destroy re # destroy itertools # destroy _abc # destroy _sre # destroy posix # destroy _functools # destroy builtins # destroy _thread # clear sys.audit hooks , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 4 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. [WARNING]: Module invocation had junk after the JSON data: 11579 1726882173.77768: done with _execute_module (stat, {'path': '/run/ostree-booted', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None,
'_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882173.77771: _low_level_execute_command(): starting 11579 1726882173.77773: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882173.1819508-11663-135177153972299/ > /dev/null 2>&1 && sleep 0' 11579 1726882173.78313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882173.78324: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882173.78399: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882173.78403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882173.78405: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882173.78407: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882173.78416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882173.78419: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882173.78561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882173.78569: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882173.78661: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882173.80451: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882173.80567: stderr chunk (state=3): >>><<< 11579 1726882173.80570: stdout chunk (state=3): >>><<< 11579 1726882173.80701: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882173.80710: handler run complete 11579 1726882173.80713: attempt loop complete, returning result 11579 1726882173.80815: _execute() done 11579 1726882173.80820: dumping result to json 11579 1726882173.80828: done dumping result, returning 11579 
1726882173.80831: done running TaskExecutor() for managed_node1/TASK: Check if system is ostree [12673a56-9f93-f197-7423-0000000000df] 11579 1726882173.80834: sending task result for task 12673a56-9f93-f197-7423-0000000000df 11579 1726882173.80895: done sending task result for task 12673a56-9f93-f197-7423-0000000000df ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11579 1726882173.80959: no more pending results, returning what we have 11579 1726882173.80962: results queue empty 11579 1726882173.80963: checking for any_errors_fatal 11579 1726882173.80967: done checking for any_errors_fatal 11579 1726882173.80968: checking for max_fail_percentage 11579 1726882173.80969: done checking for max_fail_percentage 11579 1726882173.80970: checking to see if all hosts have failed and the running result is not ok 11579 1726882173.80971: done checking to see if all hosts have failed 11579 1726882173.80971: getting the remaining hosts for this loop 11579 1726882173.80972: done getting the remaining hosts for this loop 11579 1726882173.80975: getting the next task for host managed_node1 11579 1726882173.80981: done getting next task for host managed_node1 11579 1726882173.80983: ^ task is: TASK: Set flag to indicate system is ostree 11579 1726882173.80985: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882173.80989: getting variables 11579 1726882173.80991: in VariableManager get_vars() 11579 1726882173.81024: Calling all_inventory to load vars for managed_node1 11579 1726882173.81027: Calling groups_inventory to load vars for managed_node1 11579 1726882173.81031: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882173.81045: Calling all_plugins_play to load vars for managed_node1 11579 1726882173.81048: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882173.81052: Calling groups_plugins_play to load vars for managed_node1 11579 1726882173.81428: WORKER PROCESS EXITING 11579 1726882173.81451: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882173.81779: done with get_vars() 11579 1726882173.81789: done getting variables 11579 1726882173.81998: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Set flag to indicate system is ostree] *********************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:22 Friday 20 September 2024 21:29:33 -0400 (0:00:00.729) 0:00:02.528 ****** 11579 1726882173.82028: entering _queue_task() for managed_node1/set_fact 11579 1726882173.82030: Creating lock for set_fact 11579 1726882173.82640: worker is 1 (out of 1 available) 11579 1726882173.82649: exiting _queue_task() for managed_node1/set_fact 11579 1726882173.82659: done queuing things up, now waiting for results queue to drain 11579 1726882173.82660: waiting for pending results... 
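For context, the "Check if system is ostree" / "Set flag to indicate system is ostree" pair being traced here typically corresponds to tasks of roughly the following shape. This is a hedged reconstruction from the trace (the `stat` on `/run/ostree-booted`, the `__ostree_booted_stat` register, and the `not __network_is_ostree is defined` conditional all appear in the log); the actual contents of `el_repo_setup.yml` are not shown in this output and may differ:

```yaml
# Hedged sketch reconstructed from the trace, not the verbatim task file.
- name: Check if system is ostree
  stat:
    path: /run/ostree-booted
  register: __ostree_booted_stat
  when: not __network_is_ostree is defined

- name: Set flag to indicate system is ostree
  set_fact:
    __network_is_ostree: "{{ __ostree_booted_stat.stat.exists }}"
  when: not __network_is_ostree is defined
```

With `stat.exists` false (as in the task result above), the fact `__network_is_ostree` is set to `false`, which matches the `set_fact` result printed below.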
11579 1726882173.83009: running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree
11579 1726882173.83142: in run() - task 12673a56-9f93-f197-7423-0000000000e0
11579 1726882173.83276: variable 'ansible_search_path' from source: unknown
11579 1726882173.83280: variable 'ansible_search_path' from source: unknown
11579 1726882173.83319: calling self._execute()
11579 1726882173.83504: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882173.83511: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882173.83519: variable 'omit' from source: magic vars
11579 1726882173.84686: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11579 1726882173.85116: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11579 1726882173.85401: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11579 1726882173.85405: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11579 1726882173.85407: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11579 1726882173.85644: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11579 1726882173.85785: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11579 1726882173.85821: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882173.85848: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11579 1726882173.86083: Evaluated conditional (not __network_is_ostree is defined): True
11579 1726882173.86092: variable 'omit' from source: magic vars
11579 1726882173.86249: variable 'omit' from source: magic vars
11579 1726882173.86559: variable '__ostree_booted_stat' from source: set_fact
11579 1726882173.86712: variable 'omit' from source: magic vars
11579 1726882173.86752: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11579 1726882173.86828: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11579 1726882173.86850: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11579 1726882173.86876: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882173.86897: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882173.86936: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11579 1726882173.86952: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882173.86972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882173.87082: Set connection var ansible_timeout to 10
11579 1726882173.87089: Set connection var ansible_shell_type to sh
11579 1726882173.87192: Set connection var ansible_module_compression to ZIP_DEFLATED
11579 1726882173.87199: Set connection var ansible_shell_executable to /bin/sh
11579 1726882173.87203: Set connection var ansible_pipelining to False
11579 1726882173.87205: Set connection var ansible_connection to ssh
11579 1726882173.87207: variable 'ansible_shell_executable' from source: unknown
11579 1726882173.87210: variable 'ansible_connection' from source: unknown
11579 1726882173.87213: variable 'ansible_module_compression' from source: unknown
11579 1726882173.87215: variable 'ansible_shell_type' from source: unknown
11579 1726882173.87218: variable 'ansible_shell_executable' from source: unknown
11579 1726882173.87221: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882173.87224: variable 'ansible_pipelining' from source: unknown
11579 1726882173.87226: variable 'ansible_timeout' from source: unknown
11579 1726882173.87229: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882173.87323: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11579 1726882173.87346: variable 'omit' from source: magic vars
11579 1726882173.87356: starting attempt loop
11579 1726882173.87363: running the handler
11579 1726882173.87379: handler run complete
11579 1726882173.87392: attempt loop complete, returning result
11579 1726882173.87498: _execute() done
11579 1726882173.87505: dumping result to json
11579 1726882173.87508: done dumping result, returning
11579 1726882173.87513: done running TaskExecutor() for managed_node1/TASK: Set flag to indicate system is ostree [12673a56-9f93-f197-7423-0000000000e0]
11579 1726882173.87516: sending task result for task 12673a56-9f93-f197-7423-0000000000e0
11579 1726882173.87585: done sending task result for task 12673a56-9f93-f197-7423-0000000000e0
11579 1726882173.87589: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "ansible_facts": {
        "__network_is_ostree": false
    },
    "changed": false
}
11579 1726882173.87685: no more pending results, returning what we have
11579 1726882173.87687: results queue empty
11579 1726882173.87688: checking for any_errors_fatal
11579 1726882173.87695: done checking for any_errors_fatal
11579 1726882173.87696: checking for max_fail_percentage
11579 1726882173.87698: done checking for max_fail_percentage
11579 1726882173.87698: checking to see if all hosts have failed and the running result is not ok
11579 1726882173.87699: done checking to see if all hosts have failed
11579 1726882173.87700: getting the remaining hosts for this loop
11579 1726882173.87701: done getting the remaining hosts for this loop
11579 1726882173.87704: getting the next task for host managed_node1
11579 1726882173.87711: done getting next task for host managed_node1
11579 1726882173.87714: ^ task is: TASK: Fix CentOS6 Base repo
11579 1726882173.87716: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882173.87720: getting variables
11579 1726882173.87722: in VariableManager get_vars()
11579 1726882173.87767: Calling all_inventory to load vars for managed_node1
11579 1726882173.87769: Calling groups_inventory to load vars for managed_node1
11579 1726882173.87772: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882173.87781: Calling all_plugins_play to load vars for managed_node1
11579 1726882173.87784: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882173.87791: Calling groups_plugins_play to load vars for managed_node1
11579 1726882173.88255: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882173.88608: done with get_vars()
11579 1726882173.88758: done getting variables
11579 1726882173.88998: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Fix CentOS6 Base repo] ***************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:26
Friday 20 September 2024  21:29:33 -0400 (0:00:00.069)       0:00:02.598 ******
11579 1726882173.89026: entering _queue_task() for managed_node1/copy
11579 1726882173.89656: worker is 1 (out of 1 available)
11579 1726882173.89669: exiting _queue_task() for managed_node1/copy
11579 1726882173.89680: done queuing things up, now waiting for results queue to drain
11579 1726882173.89681: waiting for pending results...
11579 1726882173.90336: running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo
11579 1726882173.90345: in run() - task 12673a56-9f93-f197-7423-0000000000e2
11579 1726882173.90582: variable 'ansible_search_path' from source: unknown
11579 1726882173.90585: variable 'ansible_search_path' from source: unknown
11579 1726882173.90588: calling self._execute()
11579 1726882173.90738: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882173.90743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882173.90755: variable 'omit' from source: magic vars
11579 1726882173.91903: variable 'ansible_distribution' from source: facts
11579 1726882173.91920: Evaluated conditional (ansible_distribution == 'CentOS'): True
11579 1726882173.92320: variable 'ansible_distribution_major_version' from source: facts
11579 1726882173.92324: Evaluated conditional (ansible_distribution_major_version == '6'): False
11579 1726882173.92327: when evaluation is False, skipping this task
11579 1726882173.92334: _execute() done
11579 1726882173.92337: dumping result to json
11579 1726882173.92340: done dumping result, returning
11579 1726882173.92343: done running TaskExecutor() for managed_node1/TASK: Fix CentOS6 Base repo [12673a56-9f93-f197-7423-0000000000e2]
11579 1726882173.92348: sending task result for task 12673a56-9f93-f197-7423-0000000000e2
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version == '6'",
    "skip_reason": "Conditional result was False"
}
11579 1726882173.93033: no more pending results, returning what we have
11579 1726882173.93037: results queue empty
11579 1726882173.93038: checking for any_errors_fatal
11579 1726882173.93043: done checking for any_errors_fatal
11579 1726882173.93044: checking for max_fail_percentage
11579 1726882173.93046: done checking for max_fail_percentage
11579 1726882173.93046: checking to see if all hosts have failed and the running result is not ok
11579 1726882173.93048: done checking to see if all hosts have failed
11579 1726882173.93049: getting the remaining hosts for this loop
11579 1726882173.93051: done getting the remaining hosts for this loop
11579 1726882173.93054: getting the next task for host managed_node1
11579 1726882173.93062: done getting next task for host managed_node1
11579 1726882173.93064: ^ task is: TASK: Include the task 'enable_epel.yml'
11579 1726882173.93067: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882173.93071: getting variables
11579 1726882173.93073: in VariableManager get_vars()
11579 1726882173.93108: Calling all_inventory to load vars for managed_node1
11579 1726882173.93111: Calling groups_inventory to load vars for managed_node1
11579 1726882173.93116: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882173.93130: Calling all_plugins_play to load vars for managed_node1
11579 1726882173.93133: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882173.93136: Calling groups_plugins_play to load vars for managed_node1
11579 1726882173.93798: done sending task result for task 12673a56-9f93-f197-7423-0000000000e2
11579 1726882173.93801: WORKER PROCESS EXITING
11579 1726882173.94213: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882173.95204: done with get_vars()
11579 1726882173.95215: done getting variables

TASK [Include the task 'enable_epel.yml'] **************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:51
Friday 20 September 2024  21:29:33 -0400 (0:00:00.062)       0:00:02.661 ******
11579 1726882173.95311: entering _queue_task() for managed_node1/include_tasks
11579 1726882173.97186: worker is 1 (out of 1 available)
11579 1726882173.97216: exiting _queue_task() for managed_node1/include_tasks
11579 1726882173.97256: done queuing things up, now waiting for results queue to drain
11579 1726882173.97258: waiting for pending results...
11579 1726882173.97511: running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml'
11579 1726882173.97718: in run() - task 12673a56-9f93-f197-7423-0000000000e3
11579 1726882173.97722: variable 'ansible_search_path' from source: unknown
11579 1726882173.97810: variable 'ansible_search_path' from source: unknown
11579 1726882173.97814: calling self._execute()
11579 1726882173.97954: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882173.97959: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882173.97970: variable 'omit' from source: magic vars
11579 1726882173.99173: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11579 1726882174.04680: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11579 1726882174.04989: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11579 1726882174.04996: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11579 1726882174.04999: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11579 1726882174.05001: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11579 1726882174.05299: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882174.05603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882174.05607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882174.05874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882174.05878: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882174.06302: variable '__network_is_ostree' from source: set_fact
11579 1726882174.06307: Evaluated conditional (not __network_is_ostree | d(false)): True
11579 1726882174.06310: _execute() done
11579 1726882174.06313: dumping result to json
11579 1726882174.06315: done dumping result, returning
11579 1726882174.06317: done running TaskExecutor() for managed_node1/TASK: Include the task 'enable_epel.yml' [12673a56-9f93-f197-7423-0000000000e3]
11579 1726882174.06319: sending task result for task 12673a56-9f93-f197-7423-0000000000e3
11579 1726882174.06540: no more pending results, returning what we have
11579 1726882174.06546: in VariableManager get_vars()
11579 1726882174.06582: Calling all_inventory to load vars for managed_node1
11579 1726882174.06585: Calling groups_inventory to load vars for managed_node1
11579 1726882174.06589: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882174.06608: Calling all_plugins_play to load vars for managed_node1
11579 1726882174.06612: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882174.06616: Calling groups_plugins_play to load vars for managed_node1
11579 1726882174.07412: done sending task result for task 12673a56-9f93-f197-7423-0000000000e3
11579 1726882174.07415: WORKER PROCESS EXITING
11579 1726882174.07439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882174.08209: done with get_vars()
11579 1726882174.08220: variable 'ansible_search_path' from source: unknown
11579 1726882174.08221: variable 'ansible_search_path' from source: unknown
11579 1726882174.08264: we have included files to process
11579 1726882174.08265: generating all_blocks data
11579 1726882174.08267: done generating all_blocks data
11579 1726882174.08609: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11579 1726882174.08611: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11579 1726882174.08615: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml
11579 1726882174.10541: done processing included file
11579 1726882174.10544: iterating over new_blocks loaded from include file
11579 1726882174.10545: in VariableManager get_vars()
11579 1726882174.10558: done with get_vars()
11579 1726882174.10560: filtering new block on tags
11579 1726882174.10919: done filtering new block on tags
11579 1726882174.10923: in VariableManager get_vars()
11579 1726882174.10935: done with get_vars()
11579 1726882174.10937: filtering new block on tags
11579 1726882174.10948: done filtering new block on tags
11579 1726882174.10950: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml for managed_node1
11579 1726882174.10957: extending task lists for all hosts with included blocks
11579 1726882174.11274: done extending task lists
11579 1726882174.11275: done processing included files
11579 1726882174.11276: results queue empty
11579 1726882174.11277: checking for any_errors_fatal
11579 1726882174.11280: done checking for any_errors_fatal
11579 1726882174.11281: checking for max_fail_percentage
11579 1726882174.11282: done checking for max_fail_percentage
11579 1726882174.11282: checking to see if all hosts have failed and the running result is not ok
11579 1726882174.11283: done checking to see if all hosts have failed
11579 1726882174.11284: getting the remaining hosts for this loop
11579 1726882174.11287: done getting the remaining hosts for this loop
11579 1726882174.11289: getting the next task for host managed_node1
11579 1726882174.11445: done getting next task for host managed_node1
11579 1726882174.11448: ^ task is: TASK: Create EPEL {{ ansible_distribution_major_version }}
11579 1726882174.11451: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882174.11453: getting variables
11579 1726882174.11454: in VariableManager get_vars()
11579 1726882174.11463: Calling all_inventory to load vars for managed_node1
11579 1726882174.11465: Calling groups_inventory to load vars for managed_node1
11579 1726882174.11468: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882174.11473: Calling all_plugins_play to load vars for managed_node1
11579 1726882174.11480: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882174.11483: Calling groups_plugins_play to load vars for managed_node1
11579 1726882174.11868: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882174.12575: done with get_vars()
11579 1726882174.12583: done getting variables
11579 1726882174.12771: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
11579 1726882174.13607: variable 'ansible_distribution_major_version' from source: facts

TASK [Create EPEL 10] **********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:8
Friday 20 September 2024  21:29:34 -0400 (0:00:00.183)       0:00:02.844 ******
11579 1726882174.13655: entering _queue_task() for managed_node1/command
11579 1726882174.13657: Creating lock for command
11579 1726882174.14553: worker is 1 (out of 1 available)
11579 1726882174.14789: exiting _queue_task() for managed_node1/command
11579 1726882174.14802: done queuing things up, now waiting for results queue to drain
11579 1726882174.14804: waiting for pending results...
11579 1726882174.14995: running TaskExecutor() for managed_node1/TASK: Create EPEL 10
11579 1726882174.15207: in run() - task 12673a56-9f93-f197-7423-0000000000fd
11579 1726882174.15503: variable 'ansible_search_path' from source: unknown
11579 1726882174.15507: variable 'ansible_search_path' from source: unknown
11579 1726882174.15510: calling self._execute()
11579 1726882174.15539: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882174.15551: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882174.15568: variable 'omit' from source: magic vars
11579 1726882174.16385: variable 'ansible_distribution' from source: facts
11579 1726882174.16406: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11579 1726882174.16537: variable 'ansible_distribution_major_version' from source: facts
11579 1726882174.16798: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11579 1726882174.16803: when evaluation is False, skipping this task
11579 1726882174.16806: _execute() done
11579 1726882174.16808: dumping result to json
11579 1726882174.16811: done dumping result, returning
11579 1726882174.16814: done running TaskExecutor() for managed_node1/TASK: Create EPEL 10 [12673a56-9f93-f197-7423-0000000000fd]
11579 1726882174.16816: sending task result for task 12673a56-9f93-f197-7423-0000000000fd
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11579 1726882174.16944: no more pending results, returning what we have
11579 1726882174.16947: results queue empty
11579 1726882174.16948: checking for any_errors_fatal
11579 1726882174.16949: done checking for any_errors_fatal
11579 1726882174.16949: checking for max_fail_percentage
11579 1726882174.16951: done checking for max_fail_percentage
11579 1726882174.16951: checking to see if all hosts have failed and the running result is not ok
11579 1726882174.16953: done checking to see if all hosts have failed
11579 1726882174.16953: getting the remaining hosts for this loop
11579 1726882174.16955: done getting the remaining hosts for this loop
11579 1726882174.16958: getting the next task for host managed_node1
11579 1726882174.16965: done getting next task for host managed_node1
11579 1726882174.16968: ^ task is: TASK: Install yum-utils package
11579 1726882174.16972: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882174.16975: getting variables
11579 1726882174.16976: in VariableManager get_vars()
11579 1726882174.17011: Calling all_inventory to load vars for managed_node1
11579 1726882174.17014: Calling groups_inventory to load vars for managed_node1
11579 1726882174.17018: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882174.17033: Calling all_plugins_play to load vars for managed_node1
11579 1726882174.17036: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882174.17040: Calling groups_plugins_play to load vars for managed_node1
11579 1726882174.17644: done sending task result for task 12673a56-9f93-f197-7423-0000000000fd
11579 1726882174.17648: WORKER PROCESS EXITING
11579 1726882174.17765: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882174.18254: done with get_vars()
11579 1726882174.18264: done getting variables
11579 1726882174.18699: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)

TASK [Install yum-utils package] ***********************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:26
Friday 20 September 2024  21:29:34 -0400 (0:00:00.050)       0:00:02.895 ******
11579 1726882174.18731: entering _queue_task() for managed_node1/package
11579 1726882174.18733: Creating lock for package
11579 1726882174.19637: worker is 1 (out of 1 available)
11579 1726882174.19648: exiting _queue_task() for managed_node1/package
11579 1726882174.19658: done queuing things up, now waiting for results queue to drain
11579 1726882174.19659: waiting for pending results...
11579 1726882174.19908: running TaskExecutor() for managed_node1/TASK: Install yum-utils package
11579 1726882174.20315: in run() - task 12673a56-9f93-f197-7423-0000000000fe
11579 1726882174.20499: variable 'ansible_search_path' from source: unknown
11579 1726882174.20502: variable 'ansible_search_path' from source: unknown
11579 1726882174.20504: calling self._execute()
11579 1726882174.20507: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882174.20509: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882174.20511: variable 'omit' from source: magic vars
11579 1726882174.21132: variable 'ansible_distribution' from source: facts
11579 1726882174.21149: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True
11579 1726882174.21387: variable 'ansible_distribution_major_version' from source: facts
11579 1726882174.21517: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False
11579 1726882174.21526: when evaluation is False, skipping this task
11579 1726882174.21532: _execute() done
11579 1726882174.21539: dumping result to json
11579 1726882174.21546: done dumping result, returning
11579 1726882174.21555: done running TaskExecutor() for managed_node1/TASK: Install yum-utils package [12673a56-9f93-f197-7423-0000000000fe]
11579 1726882174.21566: sending task result for task 12673a56-9f93-f197-7423-0000000000fe
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "ansible_distribution_major_version in ['7', '8']",
    "skip_reason": "Conditional result was False"
}
11579 1726882174.21716: no more pending results, returning what we have
11579 1726882174.21720: results queue empty
11579 1726882174.21720: checking for any_errors_fatal
11579 1726882174.21726: done checking for any_errors_fatal
11579 1726882174.21726: checking for max_fail_percentage
11579 1726882174.21728: done checking for max_fail_percentage
11579 1726882174.21728: checking to see if all hosts have failed and the running result is not ok
11579 1726882174.21729: done checking to see if all hosts have failed
11579 1726882174.21730: getting the remaining hosts for this loop
11579 1726882174.21731: done getting the remaining hosts for this loop
11579 1726882174.21734: getting the next task for host managed_node1
11579 1726882174.21741: done getting next task for host managed_node1
11579 1726882174.21742: ^ task is: TASK: Enable EPEL 7
11579 1726882174.21746: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882174.21749: getting variables
11579 1726882174.21750: in VariableManager get_vars()
11579 1726882174.21782: Calling all_inventory to load vars for managed_node1
11579 1726882174.21785: Calling groups_inventory to load vars for managed_node1
11579 1726882174.21789: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882174.21804: Calling all_plugins_play to load vars for managed_node1
11579 1726882174.21807: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882174.21810: Calling groups_plugins_play to load vars for managed_node1
11579 1726882174.22187: done sending task result for task 12673a56-9f93-f197-7423-0000000000fe
11579 1726882174.22191: WORKER PROCESS EXITING
11579 1726882174.22443: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882174.23201: done with get_vars()
11579 1726882174.23212: done getting variables
11579 1726882174.23267: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Enable EPEL 7] ***********************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:32
Friday 20 September 2024  21:29:34 -0400 (0:00:00.045)       0:00:02.942 ******
11579 1726882174.23415: entering _queue_task() for managed_node1/command
11579 1726882174.23870: worker is 1 (out of 1 available)
11579 1726882174.23881: exiting _queue_task() for managed_node1/command
11579 1726882174.24098: done queuing things up, now waiting for results queue to drain
11579 1726882174.24100: waiting for pending results...
11579 1726882174.24320: running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 11579 1726882174.24799: in run() - task 12673a56-9f93-f197-7423-0000000000ff 11579 1726882174.24803: variable 'ansible_search_path' from source: unknown 11579 1726882174.24805: variable 'ansible_search_path' from source: unknown 11579 1726882174.24808: calling self._execute() 11579 1726882174.24922: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.24935: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.24950: variable 'omit' from source: magic vars 11579 1726882174.25737: variable 'ansible_distribution' from source: facts 11579 1726882174.25998: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11579 1726882174.26140: variable 'ansible_distribution_major_version' from source: facts 11579 1726882174.26151: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11579 1726882174.26159: when evaluation is False, skipping this task 11579 1726882174.26165: _execute() done 11579 1726882174.26171: dumping result to json 11579 1726882174.26178: done dumping result, returning 11579 1726882174.26187: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 7 [12673a56-9f93-f197-7423-0000000000ff] 11579 1726882174.26200: sending task result for task 12673a56-9f93-f197-7423-0000000000ff skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11579 1726882174.26354: no more pending results, returning what we have 11579 1726882174.26357: results queue empty 11579 1726882174.26358: checking for any_errors_fatal 11579 1726882174.26364: done checking for any_errors_fatal 11579 1726882174.26365: checking for max_fail_percentage 11579 1726882174.26367: done checking for max_fail_percentage 11579 1726882174.26368: checking to see if all hosts have failed and 
the running result is not ok 11579 1726882174.26369: done checking to see if all hosts have failed 11579 1726882174.26370: getting the remaining hosts for this loop 11579 1726882174.26371: done getting the remaining hosts for this loop 11579 1726882174.26375: getting the next task for host managed_node1 11579 1726882174.26381: done getting next task for host managed_node1 11579 1726882174.26384: ^ task is: TASK: Enable EPEL 8 11579 1726882174.26387: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.26390: getting variables 11579 1726882174.26392: in VariableManager get_vars() 11579 1726882174.26426: Calling all_inventory to load vars for managed_node1 11579 1726882174.26429: Calling groups_inventory to load vars for managed_node1 11579 1726882174.26433: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.26447: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.26451: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.26454: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.27049: done sending task result for task 12673a56-9f93-f197-7423-0000000000ff 11579 1726882174.27053: WORKER PROCESS EXITING 11579 1726882174.27059: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.27559: done with get_vars() 11579 1726882174.27568: done getting variables 11579 1726882174.27683: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 8] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:37 Friday 20 September 2024 21:29:34 -0400 (0:00:00.043) 0:00:02.986 ****** 11579 1726882174.27777: entering _queue_task() for managed_node1/command 11579 1726882174.28273: worker is 1 (out of 1 available) 11579 1726882174.28283: exiting _queue_task() for managed_node1/command 11579 1726882174.28520: done queuing things up, now waiting for results queue to drain 11579 1726882174.28522: waiting for pending results... 
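The entries above show the "Enable EPEL 7" task being skipped: the first conditional (`ansible_distribution in ['RedHat', 'CentOS']`) evaluated True, but `ansible_distribution_major_version in ['7', '8']` evaluated False, so `_execute()` returns a skip result before any module payload is built or sent. A hedged sketch of a task shaped like this one — only the two `when` conditions are taken from the logged evaluations; the action and command are illustrative assumptions, not from the log or the real enable_epel.yml:

```yaml
# Sketch only -- the actual enable_epel.yml task may differ.
# The two `when` conditions are copied from the conditional evaluations above;
# Ansible evaluates them in order and skips at the first False.
- name: Enable EPEL 7
  command: dnf config-manager --enable epel   # assumed action, not from the log
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version in ['7', '8']
```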
11579 1726882174.28603: running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 11579 1726882174.28763: in run() - task 12673a56-9f93-f197-7423-000000000100 11579 1726882174.28786: variable 'ansible_search_path' from source: unknown 11579 1726882174.28795: variable 'ansible_search_path' from source: unknown 11579 1726882174.28835: calling self._execute() 11579 1726882174.28919: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.28932: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.28947: variable 'omit' from source: magic vars 11579 1726882174.29329: variable 'ansible_distribution' from source: facts 11579 1726882174.29346: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11579 1726882174.29478: variable 'ansible_distribution_major_version' from source: facts 11579 1726882174.29489: Evaluated conditional (ansible_distribution_major_version in ['7', '8']): False 11579 1726882174.29508: when evaluation is False, skipping this task 11579 1726882174.29515: _execute() done 11579 1726882174.29522: dumping result to json 11579 1726882174.29529: done dumping result, returning 11579 1726882174.29539: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 8 [12673a56-9f93-f197-7423-000000000100] 11579 1726882174.29548: sending task result for task 12673a56-9f93-f197-7423-000000000100 11579 1726882174.29677: done sending task result for task 12673a56-9f93-f197-7423-000000000100 11579 1726882174.29680: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version in ['7', '8']", "skip_reason": "Conditional result was False" } 11579 1726882174.29762: no more pending results, returning what we have 11579 1726882174.29766: results queue empty 11579 1726882174.29767: checking for any_errors_fatal 11579 1726882174.29771: done checking for any_errors_fatal 11579 1726882174.29772: checking for 
max_fail_percentage 11579 1726882174.29773: done checking for max_fail_percentage 11579 1726882174.29774: checking to see if all hosts have failed and the running result is not ok 11579 1726882174.29775: done checking to see if all hosts have failed 11579 1726882174.29776: getting the remaining hosts for this loop 11579 1726882174.29778: done getting the remaining hosts for this loop 11579 1726882174.29781: getting the next task for host managed_node1 11579 1726882174.29790: done getting next task for host managed_node1 11579 1726882174.29792: ^ task is: TASK: Enable EPEL 6 11579 1726882174.29798: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.29801: getting variables 11579 1726882174.29803: in VariableManager get_vars() 11579 1726882174.29912: Calling all_inventory to load vars for managed_node1 11579 1726882174.29915: Calling groups_inventory to load vars for managed_node1 11579 1726882174.29919: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.29929: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.29938: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.29941: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.30606: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.30821: done with get_vars() 11579 1726882174.30830: done getting variables 11579 1726882174.30881: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Enable EPEL 6] *********************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/enable_epel.yml:42 Friday 20 September 2024 21:29:34 -0400 (0:00:00.032) 0:00:03.018 ****** 11579 1726882174.31043: entering _queue_task() for managed_node1/copy 11579 1726882174.31533: worker is 1 (out of 1 available) 11579 1726882174.31545: exiting _queue_task() for managed_node1/copy 11579 1726882174.31555: done queuing things up, now waiting for results queue to drain 11579 1726882174.31556: waiting for pending results... 
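For "Enable EPEL 6" the engine loads the `copy` action plugin (enable_epel.yml:42), and the entries that follow skip it on `ansible_distribution_major_version == '6'`. A speculative sketch of that shape — only the `copy` action and the conditional are grounded in the trace; the destination path and file contents here are invented placeholders:

```yaml
# Speculative sketch: 'copy' and the == '6' conditional appear in the trace;
# dest and content are placeholders, not from the log.
- name: Enable EPEL 6
  copy:
    dest: /etc/yum.repos.d/epel.repo   # assumed path
    content: |
      [epel]
      enabled=1
  when:
    - ansible_distribution in ['RedHat', 'CentOS']
    - ansible_distribution_major_version == '6'
```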
11579 1726882174.32041: running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 11579 1726882174.32116: in run() - task 12673a56-9f93-f197-7423-000000000102 11579 1726882174.32124: variable 'ansible_search_path' from source: unknown 11579 1726882174.32171: variable 'ansible_search_path' from source: unknown 11579 1726882174.32178: calling self._execute() 11579 1726882174.32257: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.32269: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.32290: variable 'omit' from source: magic vars 11579 1726882174.32731: variable 'ansible_distribution' from source: facts 11579 1726882174.32769: Evaluated conditional (ansible_distribution in ['RedHat', 'CentOS']): True 11579 1726882174.32880: variable 'ansible_distribution_major_version' from source: facts 11579 1726882174.32932: Evaluated conditional (ansible_distribution_major_version == '6'): False 11579 1726882174.32936: when evaluation is False, skipping this task 11579 1726882174.32938: _execute() done 11579 1726882174.32940: dumping result to json 11579 1726882174.32941: done dumping result, returning 11579 1726882174.32944: done running TaskExecutor() for managed_node1/TASK: Enable EPEL 6 [12673a56-9f93-f197-7423-000000000102] 11579 1726882174.32945: sending task result for task 12673a56-9f93-f197-7423-000000000102 skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version == '6'", "skip_reason": "Conditional result was False" } 11579 1726882174.33264: no more pending results, returning what we have 11579 1726882174.33268: results queue empty 11579 1726882174.33269: checking for any_errors_fatal 11579 1726882174.33274: done checking for any_errors_fatal 11579 1726882174.33275: checking for max_fail_percentage 11579 1726882174.33276: done checking for max_fail_percentage 11579 1726882174.33277: checking to see if all hosts have failed and the running 
result is not ok 11579 1726882174.33278: done checking to see if all hosts have failed 11579 1726882174.33279: getting the remaining hosts for this loop 11579 1726882174.33280: done getting the remaining hosts for this loop 11579 1726882174.33283: getting the next task for host managed_node1 11579 1726882174.33290: done getting next task for host managed_node1 11579 1726882174.33294: ^ task is: TASK: Set network provider to 'nm' 11579 1726882174.33297: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882174.33300: getting variables 11579 1726882174.33301: in VariableManager get_vars() 11579 1726882174.33325: Calling all_inventory to load vars for managed_node1 11579 1726882174.33327: Calling groups_inventory to load vars for managed_node1 11579 1726882174.33330: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.33339: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.33342: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.33345: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.33551: done sending task result for task 12673a56-9f93-f197-7423-000000000102 11579 1726882174.33555: WORKER PROCESS EXITING 11579 1726882174.33576: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.33974: done with get_vars() 11579 1726882174.33982: done getting variables 11579 1726882174.34147: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [Set network provider to 'nm'] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:13 Friday 20 September 2024 21:29:34 -0400 (0:00:00.031) 0:00:03.050 ****** 11579 1726882174.34171: entering _queue_task() for managed_node1/set_fact 11579 1726882174.35277: worker is 1 (out of 1 available) 11579 1726882174.35290: exiting _queue_task() for managed_node1/set_fact 11579 1726882174.35300: done queuing things up, now waiting for results queue to drain 11579 1726882174.35302: waiting for pending results... 11579 1726882174.35583: running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' 11579 1726882174.35795: in run() - task 12673a56-9f93-f197-7423-000000000007 11579 1726882174.35799: variable 'ansible_search_path' from source: unknown 11579 1726882174.35888: calling self._execute() 11579 1726882174.36101: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.36104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.36108: variable 'omit' from source: magic vars 11579 1726882174.36355: variable 'omit' from source: magic vars 11579 1726882174.36557: variable 'omit' from source: magic vars 11579 1726882174.36561: variable 'omit' from source: magic vars 11579 1726882174.36690: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882174.36884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882174.36915: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882174.36943: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882174.37009: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882174.37062: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882174.37251: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.37255: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.37519: Set connection var ansible_timeout to 10 11579 1726882174.37551: Set connection var ansible_shell_type to sh 11579 1726882174.37588: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882174.37660: Set connection var ansible_shell_executable to /bin/sh 11579 1726882174.37673: Set connection var ansible_pipelining to False 11579 1726882174.37691: Set connection var ansible_connection to ssh 11579 1726882174.37782: variable 'ansible_shell_executable' from source: unknown 11579 1726882174.37998: variable 'ansible_connection' from source: unknown 11579 1726882174.38002: variable 'ansible_module_compression' from source: unknown 11579 1726882174.38006: variable 'ansible_shell_type' from source: unknown 11579 1726882174.38008: variable 'ansible_shell_executable' from source: unknown 11579 1726882174.38010: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.38012: variable 'ansible_pipelining' from source: unknown 11579 1726882174.38014: variable 'ansible_timeout' from source: unknown 11579 1726882174.38016: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.38427: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882174.38431: variable 'omit' from source: magic vars 11579 1726882174.38434: starting 
attempt loop 11579 1726882174.38436: running the handler 11579 1726882174.38439: handler run complete 11579 1726882174.38441: attempt loop complete, returning result 11579 1726882174.38443: _execute() done 11579 1726882174.38445: dumping result to json 11579 1726882174.38446: done dumping result, returning 11579 1726882174.38448: done running TaskExecutor() for managed_node1/TASK: Set network provider to 'nm' [12673a56-9f93-f197-7423-000000000007] 11579 1726882174.38450: sending task result for task 12673a56-9f93-f197-7423-000000000007 11579 1726882174.38520: done sending task result for task 12673a56-9f93-f197-7423-000000000007 11579 1726882174.38523: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "network_provider": "nm" }, "changed": false } 11579 1726882174.38583: no more pending results, returning what we have 11579 1726882174.38586: results queue empty 11579 1726882174.38587: checking for any_errors_fatal 11579 1726882174.38594: done checking for any_errors_fatal 11579 1726882174.38595: checking for max_fail_percentage 11579 1726882174.38597: done checking for max_fail_percentage 11579 1726882174.38598: checking to see if all hosts have failed and the running result is not ok 11579 1726882174.38599: done checking to see if all hosts have failed 11579 1726882174.38599: getting the remaining hosts for this loop 11579 1726882174.38601: done getting the remaining hosts for this loop 11579 1726882174.38605: getting the next task for host managed_node1 11579 1726882174.38611: done getting next task for host managed_node1 11579 1726882174.38614: ^ task is: TASK: meta (flush_handlers) 11579 1726882174.38616: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.38620: getting variables 11579 1726882174.38622: in VariableManager get_vars() 11579 1726882174.38654: Calling all_inventory to load vars for managed_node1 11579 1726882174.38657: Calling groups_inventory to load vars for managed_node1 11579 1726882174.38662: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.38672: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.38676: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.38679: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.39257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.40345: done with get_vars() 11579 1726882174.40354: done getting variables 11579 1726882174.40642: in VariableManager get_vars() 11579 1726882174.40651: Calling all_inventory to load vars for managed_node1 11579 1726882174.40653: Calling groups_inventory to load vars for managed_node1 11579 1726882174.40655: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.40659: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.40661: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.40664: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.41179: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.41743: done with get_vars() 11579 1726882174.41756: done queuing things up, now waiting for results queue to drain 11579 1726882174.41762: results queue empty 11579 1726882174.41780: checking for any_errors_fatal 11579 1726882174.41782: done checking for any_errors_fatal 11579 1726882174.41783: checking for max_fail_percentage 11579 1726882174.41784: done checking for max_fail_percentage 11579 1726882174.41785: checking to see if all hosts have failed and the running result is not 
ok 11579 1726882174.41786: done checking to see if all hosts have failed 11579 1726882174.41786: getting the remaining hosts for this loop 11579 1726882174.41787: done getting the remaining hosts for this loop 11579 1726882174.41790: getting the next task for host managed_node1 11579 1726882174.41795: done getting next task for host managed_node1 11579 1726882174.41797: ^ task is: TASK: meta (flush_handlers) 11579 1726882174.41798: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882174.41922: getting variables 11579 1726882174.41924: in VariableManager get_vars() 11579 1726882174.41932: Calling all_inventory to load vars for managed_node1 11579 1726882174.41935: Calling groups_inventory to load vars for managed_node1 11579 1726882174.41937: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.41941: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.41944: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.41947: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.42157: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.42588: done with get_vars() 11579 1726882174.42598: done getting variables 11579 1726882174.42640: in VariableManager get_vars() 11579 1726882174.42649: Calling all_inventory to load vars for managed_node1 11579 1726882174.42651: Calling groups_inventory to load vars for managed_node1 11579 1726882174.42653: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.42657: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.42660: Calling groups_plugins_inventory to load vars for 
managed_node1 11579 1726882174.42662: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.43017: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.43474: done with get_vars() 11579 1726882174.43485: done queuing things up, now waiting for results queue to drain 11579 1726882174.43486: results queue empty 11579 1726882174.43487: checking for any_errors_fatal 11579 1726882174.43488: done checking for any_errors_fatal 11579 1726882174.43489: checking for max_fail_percentage 11579 1726882174.43490: done checking for max_fail_percentage 11579 1726882174.43491: checking to see if all hosts have failed and the running result is not ok 11579 1726882174.43491: done checking to see if all hosts have failed 11579 1726882174.43492: getting the remaining hosts for this loop 11579 1726882174.43495: done getting the remaining hosts for this loop 11579 1726882174.43498: getting the next task for host managed_node1 11579 1726882174.43500: done getting next task for host managed_node1 11579 1726882174.43501: ^ task is: None 11579 1726882174.43502: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.43503: done queuing things up, now waiting for results queue to drain 11579 1726882174.43504: results queue empty 11579 1726882174.43505: checking for any_errors_fatal 11579 1726882174.43505: done checking for any_errors_fatal 11579 1726882174.43506: checking for max_fail_percentage 11579 1726882174.43507: done checking for max_fail_percentage 11579 1726882174.43508: checking to see if all hosts have failed and the running result is not ok 11579 1726882174.43508: done checking to see if all hosts have failed 11579 1726882174.43510: getting the next task for host managed_node1 11579 1726882174.43512: done getting next task for host managed_node1 11579 1726882174.43513: ^ task is: None 11579 1726882174.43514: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.43782: in VariableManager get_vars() 11579 1726882174.43808: done with get_vars() 11579 1726882174.43815: in VariableManager get_vars() 11579 1726882174.43830: done with get_vars() 11579 1726882174.43834: variable 'omit' from source: magic vars 11579 1726882174.43865: in VariableManager get_vars() 11579 1726882174.44001: done with get_vars() 11579 1726882174.44024: variable 'omit' from source: magic vars PLAY [Play for testing bond connection] **************************************** 11579 1726882174.45421: Loading StrategyModule 'linear' from /usr/local/lib/python3.12/site-packages/ansible/plugins/strategy/linear.py (found_in_cache=True, class_only=False) 11579 1726882174.45526: getting the remaining hosts for this loop 11579 1726882174.45527: done getting the remaining hosts for this loop 11579 1726882174.45530: getting the next task for host managed_node1 11579 1726882174.45533: done getting next task for host managed_node1 11579 1726882174.45535: ^ task is: TASK: Gathering Facts 11579 1726882174.45536: ^ state is: HOST STATE: block=0, task=0, rescue=0, always=0, handlers=0, run_state=0, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=True, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882174.45538: getting variables 11579 1726882174.45538: in VariableManager get_vars() 11579 1726882174.45550: Calling all_inventory to load vars for managed_node1 11579 1726882174.45552: Calling groups_inventory to load vars for managed_node1 11579 1726882174.45554: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882174.45559: Calling all_plugins_play to load vars for managed_node1 11579 1726882174.45571: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882174.45574: Calling groups_plugins_play to load vars for managed_node1 11579 1726882174.45980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882174.46172: done with get_vars() 11579 1726882174.46180: done getting variables 11579 1726882174.46530: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Gathering Facts] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3 Friday 20 September 2024 21:29:34 -0400 (0:00:00.123) 0:00:03.173 ****** 11579 1726882174.46551: entering _queue_task() for managed_node1/gather_facts 11579 1726882174.47102: worker is 1 (out of 1 available) 11579 1726882174.47113: exiting _queue_task() for managed_node1/gather_facts 11579 1726882174.47122: done queuing things up, now waiting for results queue to drain 11579 1726882174.47123: waiting for pending results... 
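Before the "Play for testing bond connection" begins, the trace records the one task in this stretch that actually ran: `ok: [managed_node1]` with `"ansible_facts": {"network_provider": "nm"}` from tests_bond_nm.yml:13. A minimal task that produces exactly that result (the task name matches the logged TASK banner; the rest is stock `set_fact` syntax):

```yaml
# Grounded in the ok: result logged earlier -- sets the fact the bond tests read.
- name: Set network provider to 'nm'
  set_fact:
    network_provider: nm
```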
11579 1726882174.47621: running TaskExecutor() for managed_node1/TASK: Gathering Facts 11579 1726882174.47956: in run() - task 12673a56-9f93-f197-7423-000000000128 11579 1726882174.47961: variable 'ansible_search_path' from source: unknown 11579 1726882174.47964: calling self._execute() 11579 1726882174.48034: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.48198: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.48203: variable 'omit' from source: magic vars 11579 1726882174.48888: variable 'ansible_distribution_major_version' from source: facts 11579 1726882174.48938: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882174.49140: variable 'omit' from source: magic vars 11579 1726882174.49144: variable 'omit' from source: magic vars 11579 1726882174.49146: variable 'omit' from source: magic vars 11579 1726882174.49301: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882174.49304: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882174.49328: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882174.49375: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882174.49424: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882174.49459: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882174.49628: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.49632: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.49753: Set connection var ansible_timeout to 10 11579 1726882174.49808: Set connection 
var ansible_shell_type to sh 11579 1726882174.49821: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882174.49831: Set connection var ansible_shell_executable to /bin/sh 11579 1726882174.49953: Set connection var ansible_pipelining to False 11579 1726882174.49955: Set connection var ansible_connection to ssh 11579 1726882174.49957: variable 'ansible_shell_executable' from source: unknown 11579 1726882174.49959: variable 'ansible_connection' from source: unknown 11579 1726882174.49961: variable 'ansible_module_compression' from source: unknown 11579 1726882174.49963: variable 'ansible_shell_type' from source: unknown 11579 1726882174.49965: variable 'ansible_shell_executable' from source: unknown 11579 1726882174.49967: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882174.49969: variable 'ansible_pipelining' from source: unknown 11579 1726882174.49971: variable 'ansible_timeout' from source: unknown 11579 1726882174.49973: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882174.50362: Loading ActionModule 'gather_facts' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/gather_facts.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882174.50402: variable 'omit' from source: magic vars 11579 1726882174.50499: starting attempt loop 11579 1726882174.50502: running the handler 11579 1726882174.50505: variable 'ansible_facts' from source: unknown 11579 1726882174.50506: _low_level_execute_command(): starting 11579 1726882174.50557: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882174.52206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882174.52224: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 
1726882174.52309: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882174.52380: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882174.52544: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882174.52588: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882174.54116: stdout chunk (state=3): >>>/root <<< 11579 1726882174.54251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882174.54262: stdout chunk (state=3): >>><<< 11579 1726882174.54359: stderr chunk (state=3): >>><<< 11579 1726882174.54363: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882174.54470: _low_level_execute_command(): starting 11579 1726882174.54474: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238 `" && echo ansible-tmp-1726882174.5434391-11755-141765771927238="` echo /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238 `" ) && sleep 0' 11579 1726882174.55607: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882174.55790: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882174.56028: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882174.56067: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882174.57899: stdout chunk (state=3): >>>ansible-tmp-1726882174.5434391-11755-141765771927238=/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238 <<< 11579 1726882174.58064: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882174.58068: stdout chunk (state=3): >>><<< 11579 1726882174.58072: stderr chunk (state=3): >>><<< 11579 1726882174.58096: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882174.5434391-11755-141765771927238=/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882174.58506: variable 'ansible_module_compression' from source: unknown 11579 1726882174.58509: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.setup-ZIP_DEFLATED 11579 1726882174.58512: variable 'ansible_facts' from source: unknown 11579 1726882174.58782: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py 11579 1726882174.59299: Sending initial data 11579 1726882174.59303: Sent initial data (154 bytes) 11579 1726882174.60479: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882174.60500: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882174.60589: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 
11579 1726882174.60721: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882174.60743: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882174.60816: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882174.62410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882174.62452: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882174.62504: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmplwty6por /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py <<< 11579 1726882174.62514: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py" <<< 11579 1726882174.62553: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmplwty6por" to remote "/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py" <<< 11579 1726882174.65153: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882174.65295: stderr chunk (state=3): >>><<< 11579 1726882174.65306: stdout chunk (state=3): >>><<< 11579 1726882174.65450: done transferring module to remote 11579 1726882174.65453: _low_level_execute_command(): starting 11579 1726882174.65456: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/ /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py && sleep 0' 11579 1726882174.66608: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882174.66661: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882174.66678: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882174.66701: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882174.66768: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882174.66922: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882174.66959: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882174.67018: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882174.67088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882174.68827: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882174.69133: stderr chunk (state=3): >>><<< 11579 1726882174.69140: stdout chunk (state=3): >>><<< 11579 1726882174.69143: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882174.69145: _low_level_execute_command(): starting 11579 1726882174.69147: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/AnsiballZ_setup.py && sleep 0' 11579 1726882174.70208: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882174.70439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882174.70442: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882174.70458: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 
1726882174.70486: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882174.70666: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882175.47254: stdout chunk (state=3): >>> {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, "ansible_memfree_mb": 2977, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 554, "free": 2977}, "nocache": {"free": 3310, "used": 221}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", 
"ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 608, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": "/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793505280, "block_size": 4096, "block_total": 65519099, "block_available": 63914430, "block_used": 1604669, "inode_total": 131070960, "inode_available": 131029069, "inode_used": 41891, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": 
"guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": "/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": 
"127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off 
[fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", 
"tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": 
"AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.177734375, "15m": 0.08837890625}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "35", "epoch": "1726882175", "epoch_int": "1726882175", "date": 
"2024-09-20", "time": "21:29:35", "iso8601_micro": "2024-09-21T01:29:35.465068Z", "iso8601": "2024-09-21T01:29:35Z", "iso8601_basic": "20240920T212935465068", "iso8601_basic_short": "20240920T212935", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} <<< 11579 1726882175.50191: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882175.50200: stderr chunk (state=3): >>><<< 11579 1726882175.50203: stdout chunk (state=3): >>><<< 11579 1726882175.50207: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"ansible_system": "Linux", "ansible_kernel": "6.11.0-0.rc6.23.el10.x86_64", "ansible_kernel_version": "#1 SMP PREEMPT_DYNAMIC Tue Sep 3 21:42:57 UTC 2024", "ansible_machine": "x86_64", "ansible_python_version": "3.12.5", "ansible_fqdn": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_hostname": "ip-10-31-9-159", "ansible_nodename": "ip-10-31-9-159.us-east-1.aws.redhat.com", "ansible_domain": "us-east-1.aws.redhat.com", "ansible_userspace_bits": "64", "ansible_architecture": "x86_64", "ansible_userspace_architecture": "x86_64", "ansible_machine_id": "ec2d2d02cced42c36436217cb93f6b8e", "ansible_processor": ["0", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz", "1", "GenuineIntel", "Intel(R) Xeon(R) CPU E5-2666 v3 @ 2.90GHz"], "ansible_processor_count": 1, "ansible_processor_cores": 1, "ansible_processor_threads_per_core": 2, "ansible_processor_vcpus": 2, "ansible_processor_nproc": 2, "ansible_memtotal_mb": 3531, 
"ansible_memfree_mb": 2977, "ansible_swaptotal_mb": 0, "ansible_swapfree_mb": 0, "ansible_memory_mb": {"real": {"total": 3531, "used": 554, "free": 2977}, "nocache": {"free": 3310, "used": 221}, "swap": {"total": 0, "free": 0, "used": 0, "cached": 0}}, "ansible_bios_date": "08/24/2006", "ansible_bios_vendor": "Xen", "ansible_bios_version": "4.11.amazon", "ansible_board_asset_tag": "NA", "ansible_board_name": "NA", "ansible_board_serial": "NA", "ansible_board_vendor": "NA", "ansible_board_version": "NA", "ansible_chassis_asset_tag": "NA", "ansible_chassis_serial": "NA", "ansible_chassis_vendor": "Xen", "ansible_chassis_version": "NA", "ansible_form_factor": "Other", "ansible_product_name": "HVM domU", "ansible_product_serial": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_uuid": "ec2d2d02-cced-42c3-6436-217cb93f6b8e", "ansible_product_version": "4.11.amazon", "ansible_system_vendor": "Xen", "ansible_devices": {"xvda": {"virtual": 1, "links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "vendor": null, "model": null, "sas_address": null, "sas_device_handle": null, "removable": "0", "support_discard": "512", "partitions": {"xvda2": {"links": {"ids": [], "uuids": ["4853857d-d3e2-44a6-8739-3837607c97e1"], "labels": [], "masters": []}, "start": "4096", "sectors": "524283871", "sectorsize": 512, "size": "250.00 GB", "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1", "holders": []}, "xvda1": {"links": {"ids": [], "uuids": [], "labels": [], "masters": []}, "start": "2048", "sectors": "2048", "sectorsize": 512, "size": "1.00 MB", "uuid": null, "holders": []}}, "rotational": "0", "scheduler_mode": "mq-deadline", "sectors": "524288000", "sectorsize": "512", "size": "250.00 GB", "host": "", "holders": []}}, "ansible_device_links": {"ids": {}, "uuids": {"xvda2": ["4853857d-d3e2-44a6-8739-3837607c97e1"]}, "labels": {}, "masters": {}}, "ansible_uptime_seconds": 608, "ansible_lvm": {"lvs": {}, "vgs": {}, "pvs": {}}, "ansible_mounts": [{"mount": "/", "device": 
"/dev/xvda2", "fstype": "xfs", "options": "rw,seclabel,relatime,attr2,inode64,logbufs=8,logbsize=32k,noquota", "dump": 0, "passno": 0, "size_total": 268366229504, "size_available": 261793505280, "block_size": 4096, "block_total": 65519099, "block_available": 63914430, "block_used": 1604669, "inode_total": 131070960, "inode_available": 131029069, "inode_used": 41891, "uuid": "4853857d-d3e2-44a6-8739-3837607c97e1"}], "ansible_user_id": "root", "ansible_user_uid": 0, "ansible_user_gid": 0, "ansible_user_gecos": "Super User", "ansible_user_dir": "/root", "ansible_user_shell": "/bin/bash", "ansible_real_user_id": 0, "ansible_effective_user_id": 0, "ansible_real_group_id": 0, "ansible_effective_group_id": 0, "ansible_virtualization_type": "xen", "ansible_virtualization_role": "guest", "ansible_virtualization_tech_guest": ["xen"], "ansible_virtualization_tech_host": [], "ansible_distribution": "CentOS", "ansible_distribution_release": "Stream", "ansible_distribution_version": "10", "ansible_distribution_major_version": "10", "ansible_distribution_file_path": "/etc/centos-release", "ansible_distribution_file_variety": "CentOS", "ansible_distribution_file_parsed": true, "ansible_os_family": "RedHat", "ansible_env": {"SHELL": "/bin/bash", "GPG_TTY": "/dev/pts/0", "PWD": "/root", "LOGNAME": "root", "XDG_SESSION_TYPE": "tty", "_": "/usr/bin/python3.12", "MOTD_SHOWN": "pam", "HOME": "/root", "LANG": "en_US.UTF-8", "LS_COLORS": "", "SSH_CONNECTION": "10.31.11.248 52586 10.31.9.159 22", "XDG_SESSION_CLASS": "user", "SELINUX_ROLE_REQUESTED": "", "LESSOPEN": "||/usr/bin/lesspipe.sh %s", "USER": "root", "SELINUX_USE_CURRENT_RANGE": "", "SHLVL": "1", "XDG_SESSION_ID": "5", "XDG_RUNTIME_DIR": "/run/user/0", "SSH_CLIENT": "10.31.11.248 52586 22", "DEBUGINFOD_URLS": "https://debuginfod.centos.org/ ", "PATH": "/root/.local/bin:/root/bin:/usr/local/sbin:/usr/local/bin:/usr/sbin:/usr/bin", "SELINUX_LEVEL_REQUESTED": "", "DBUS_SESSION_BUS_ADDRESS": "unix:path=/run/user/0/bus", "SSH_TTY": 
"/dev/pts/0"}, "ansible_service_mgr": "systemd", "ansible_system_capabilities_enforced": "False", "ansible_system_capabilities": [], "ansible_is_chroot": false, "ansible_hostnqn": "nqn.2014-08.org.nvmexpress:uuid:66b8343d-0be9-40f0-836f-5213a2c1e120", "ansible_dns": {"search": ["us-east-1.aws.redhat.com"], "nameservers": ["10.29.169.13", "10.29.170.12", "10.2.32.1"]}, "ansible_selinux_python_present": true, "ansible_selinux": {"status": "enabled", "policyvers": 33, "config_mode": "enforcing", "mode": "enforcing", "type": "targeted"}, "ansible_local": {}, "ansible_interfaces": ["eth0", "lo"], "ansible_lo": {"device": "lo", "mtu": 65536, "active": true, "type": "loopback", "promisc": false, "ipv4": {"address": "127.0.0.1", "broadcast": "", "netmask": "255.0.0.0", "network": "127.0.0.0", "prefix": "8"}, "ipv6": [{"address": "::1", "prefix": "128", "scope": "host"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "off [fixed]", "tx_checksum_ip_generic": "on [fixed]", "tx_checksum_ipv6": "off [fixed]", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "on [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on [fixed]", "tx_scatter_gather_fraglist": "on [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", "tx_tcp_ecn_segmentation": "on", "tx_tcp_mangleid_segmentation": "on", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "on [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "on [fixed]", "tx_lockless": "on [fixed]", "netns_local": "on [fixed]", "tx_gso_robust": "off [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", 
"tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "on", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "on", "tx_gso_list": "on", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off [fixed]", "loopback": "on [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_eth0": {"device": "eth0", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "active": true, "module": "xen_netfront", "type": "ether", "pciid": "vif-0", "promisc": false, "ipv4": {"address": "10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22"}, "ipv6": [{"address": "fe80::1030:bff:fea1:4223", "prefix": "64", "scope": "link"}], "features": {"rx_checksumming": "on [fixed]", "tx_checksumming": "on", "tx_checksum_ipv4": "on [fixed]", "tx_checksum_ip_generic": "off [fixed]", "tx_checksum_ipv6": "on", "tx_checksum_fcoe_crc": "off [fixed]", "tx_checksum_sctp": "off [fixed]", "scatter_gather": "on", "tx_scatter_gather": "on", "tx_scatter_gather_fraglist": "off [fixed]", "tcp_segmentation_offload": "on", "tx_tcp_segmentation": "on", 
"tx_tcp_ecn_segmentation": "off [fixed]", "tx_tcp_mangleid_segmentation": "off", "tx_tcp6_segmentation": "on", "generic_segmentation_offload": "on", "generic_receive_offload": "on", "large_receive_offload": "off [fixed]", "rx_vlan_offload": "off [fixed]", "tx_vlan_offload": "off [fixed]", "ntuple_filters": "off [fixed]", "receive_hashing": "off [fixed]", "highdma": "off [fixed]", "rx_vlan_filter": "off [fixed]", "vlan_challenged": "off [fixed]", "tx_lockless": "off [fixed]", "netns_local": "off [fixed]", "tx_gso_robust": "on [fixed]", "tx_fcoe_segmentation": "off [fixed]", "tx_gre_segmentation": "off [fixed]", "tx_gre_csum_segmentation": "off [fixed]", "tx_ipxip4_segmentation": "off [fixed]", "tx_ipxip6_segmentation": "off [fixed]", "tx_udp_tnl_segmentation": "off [fixed]", "tx_udp_tnl_csum_segmentation": "off [fixed]", "tx_gso_partial": "off [fixed]", "tx_tunnel_remcsum_segmentation": "off [fixed]", "tx_sctp_segmentation": "off [fixed]", "tx_esp_segmentation": "off [fixed]", "tx_udp_segmentation": "off [fixed]", "tx_gso_list": "off [fixed]", "fcoe_mtu": "off [fixed]", "tx_nocache_copy": "off", "loopback": "off [fixed]", "rx_fcs": "off [fixed]", "rx_all": "off [fixed]", "tx_vlan_stag_hw_insert": "off [fixed]", "rx_vlan_stag_hw_parse": "off [fixed]", "rx_vlan_stag_filter": "off [fixed]", "l2_fwd_offload": "off [fixed]", "hw_tc_offload": "off [fixed]", "esp_hw_offload": "off [fixed]", "esp_tx_csum_hw_offload": "off [fixed]", "rx_udp_tunnel_port_offload": "off [fixed]", "tls_hw_tx_offload": "off [fixed]", "tls_hw_rx_offload": "off [fixed]", "rx_gro_hw": "off [fixed]", "tls_hw_record": "off [fixed]", "rx_gro_list": "off", "macsec_hw_offload": "off [fixed]", "rx_udp_gro_forwarding": "off", "hsr_tag_ins_offload": "off [fixed]", "hsr_tag_rm_offload": "off [fixed]", "hsr_fwd_offload": "off [fixed]", "hsr_dup_offload": "off [fixed]"}, "timestamping": [], "hw_timestamp_filters": []}, "ansible_default_ipv4": {"gateway": "10.31.8.1", "interface": "eth0", "address": 
"10.31.9.159", "broadcast": "10.31.11.255", "netmask": "255.255.252.0", "network": "10.31.8.0", "prefix": "22", "macaddress": "12:30:0b:a1:42:23", "mtu": 9001, "type": "ether", "alias": "eth0"}, "ansible_default_ipv6": {}, "ansible_all_ipv4_addresses": ["10.31.9.159"], "ansible_all_ipv6_addresses": ["fe80::1030:bff:fea1:4223"], "ansible_locally_reachable_ips": {"ipv4": ["10.31.9.159", "127.0.0.0/8", "127.0.0.1"], "ipv6": ["::1", "fe80::1030:bff:fea1:4223"]}, "ansible_lsb": {}, "ansible_ssh_host_key_rsa_public": "AAAAB3NzaC1yc2EAAAADAQABAAABgQC9sgyYGKGPd0JFIDKIZZNkcX78Ca8OmX4GnOCt150Ftpgzzfir9Dy2HOb7d6QbQheoi9HLkHb66U2LDdt7EnBGKnI12YAuydTDfqITc2L4W9cEeoy/f2rrMlBo6FN3SNQc2voCDsWius2gK2mtTTZZI0R33PguMmqTkwYVzP0hYplwSYh5Atl+XP7/xLRhhowanh9U6x2ahqfnNq5DInqi070bKk0xZ2g12Vg8kIRno8ZQmm+ujUUevRkZysHvnrnN01ZQhqzjo/Awn+Pft6LYleTBn+YU/HlPMWR4PsFcrtT3WRdF5samSvVwWuuOC+0td2zQN4nGpYLK+FmpNG4nDfGZV/xIBBblNRvzrhKgk3lDU5qkeQ/R0godRQGbv4J1kq+3WU2E3upqBYxXWUJLM5FirAxz8tKLmaPh8YZWMKcs3X9F2ySLEcnhe5R5F6LFSNx13zQSt7lGZOIgzhvWllcs4YVxcV1Y4rTJ8jEK2KgWua+bZinJPYUJqKTzO2E=", "ansible_ssh_host_key_rsa_public_keytype": "ssh-rsa", "ansible_ssh_host_key_ecdsa_public": "AAAAE2VjZHNhLXNoYTItbmlzdHAyNTYAAAAIbmlzdHAyNTYAAABBBKk0X8hfHP7BSAAI8BDwrr4175ddN6MsanEqlp3oVMOvThKVXLpFXhvJPbq2IBTd3Wm12dL2vAW7/82zG63KYZk=", "ansible_ssh_host_key_ecdsa_public_keytype": "ecdsa-sha2-nistp256", "ansible_ssh_host_key_ed25519_public": "AAAAC3NzaC1lZDI1NTE5AAAAIDVN13dHSxa36Blsqt/Q8OyOA04CC7ZlvrS6zWL4aDyE", "ansible_ssh_host_key_ed25519_public_keytype": "ssh-ed25519", "ansible_loadavg": {"1m": 0.3759765625, "5m": 0.177734375, "15m": 0.08837890625}, "ansible_fibre_channel_wwn": [], "ansible_fips": false, "ansible_pkg_mgr": "dnf", "ansible_cmdline": {"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": "ttyS0,115200n8"}, "ansible_proc_cmdline": 
{"BOOT_IMAGE": "(hd0,gpt2)/boot/vmlinuz-6.11.0-0.rc6.23.el10.x86_64", "root": "UUID=4853857d-d3e2-44a6-8739-3837607c97e1", "ro": true, "rhgb": true, "crashkernel": "1G-4G:192M,4G-64G:256M,64G-:512M", "net.ifnames": "0", "console": ["tty0", "ttyS0,115200n8"]}, "ansible_iscsi_iqn": "", "ansible_date_time": {"year": "2024", "month": "09", "weekday": "Friday", "weekday_number": "5", "weeknumber": "38", "day": "20", "hour": "21", "minute": "29", "second": "35", "epoch": "1726882175", "epoch_int": "1726882175", "date": "2024-09-20", "time": "21:29:35", "iso8601_micro": "2024-09-21T01:29:35.465068Z", "iso8601": "2024-09-21T01:29:35Z", "iso8601_basic": "20240920T212935465068", "iso8601_basic_short": "20240920T212935", "tz": "EDT", "tz_dst": "EDT", "tz_offset": "-0400"}, "ansible_python": {"version": {"major": 3, "minor": 12, "micro": 5, "releaselevel": "final", "serial": 0}, "version_info": [3, 12, 5, "final", 0], "executable": "/usr/bin/python3.12", "has_sslcontext": true, "type": "cpython"}, "ansible_apparmor": {"status": "disabled"}, "gather_subset": ["all"], "module_setup": true}, "invocation": {"module_args": {"gather_subset": ["all"], "gather_timeout": 10, "filter": [], "fact_path": "/etc/ansible/facts.d"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final 
all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882175.51304: done with _execute_module (ansible.legacy.setup, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.setup', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882175.51322: _low_level_execute_command(): starting 11579 1726882175.51390: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882174.5434391-11755-141765771927238/ > /dev/null 2>&1 && sleep 0' 11579 1726882175.52902: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882175.52961: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882175.53036: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882175.53061: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882175.53234: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882175.55873: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882175.55927: stderr chunk (state=3): >>><<< 11579 1726882175.56011: stdout chunk (state=3): >>><<< 11579 1726882175.56014: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882175.56204: handler run complete 11579 1726882175.56342: variable 'ansible_facts' from source: unknown 11579 1726882175.56623: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882175.57432: variable 'ansible_facts' from source: unknown 11579 1726882175.57505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882175.57806: attempt loop complete, returning result 11579 1726882175.57816: _execute() done 11579 1726882175.57823: dumping result to json 11579 1726882175.57858: done dumping result, returning 11579 1726882175.58002: done running TaskExecutor() for managed_node1/TASK: Gathering Facts [12673a56-9f93-f197-7423-000000000128] 11579 1726882175.58005: sending task result for task 12673a56-9f93-f197-7423-000000000128 11579 1726882175.58844: done sending task result for task 12673a56-9f93-f197-7423-000000000128 11579 1726882175.58847: WORKER PROCESS EXITING ok: [managed_node1] 11579 1726882175.59418: no more pending results, returning what we have 11579 1726882175.59421: results queue empty 11579 1726882175.59422: checking for any_errors_fatal 11579 1726882175.59423: done checking for any_errors_fatal 11579 1726882175.59423: checking for max_fail_percentage 11579 1726882175.59425: done checking for max_fail_percentage 11579 1726882175.59425: checking to see if all hosts have failed and the running result is not ok 11579 1726882175.59426: done checking to see if all hosts have failed 11579 1726882175.59427: getting the remaining hosts for this loop 11579 1726882175.59428: done getting the remaining hosts 
for this loop 11579 1726882175.59431: getting the next task for host managed_node1 11579 1726882175.59436: done getting next task for host managed_node1 11579 1726882175.59438: ^ task is: TASK: meta (flush_handlers) 11579 1726882175.59439: ^ state is: HOST STATE: block=1, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882175.59442: getting variables 11579 1726882175.59444: in VariableManager get_vars() 11579 1726882175.59591: Calling all_inventory to load vars for managed_node1 11579 1726882175.59599: Calling groups_inventory to load vars for managed_node1 11579 1726882175.59602: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882175.59612: Calling all_plugins_play to load vars for managed_node1 11579 1726882175.59615: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882175.59618: Calling groups_plugins_play to load vars for managed_node1 11579 1726882175.59887: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882175.60350: done with get_vars() 11579 1726882175.60361: done getting variables 11579 1726882175.60545: in VariableManager get_vars() 11579 1726882175.60603: Calling all_inventory to load vars for managed_node1 11579 1726882175.60606: Calling groups_inventory to load vars for managed_node1 11579 1726882175.60608: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882175.60613: Calling all_plugins_play to load vars for managed_node1 11579 1726882175.60616: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882175.60618: Calling groups_plugins_play to load vars for managed_node1 11579 1726882175.60934: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' 
skipped due to reserved name 11579 1726882175.61458: done with get_vars() 11579 1726882175.61471: done queuing things up, now waiting for results queue to drain 11579 1726882175.61473: results queue empty 11579 1726882175.61474: checking for any_errors_fatal 11579 1726882175.61477: done checking for any_errors_fatal 11579 1726882175.61478: checking for max_fail_percentage 11579 1726882175.61479: done checking for max_fail_percentage 11579 1726882175.61479: checking to see if all hosts have failed and the running result is not ok 11579 1726882175.61480: done checking to see if all hosts have failed 11579 1726882175.61486: getting the remaining hosts for this loop 11579 1726882175.61487: done getting the remaining hosts for this loop 11579 1726882175.61489: getting the next task for host managed_node1 11579 1726882175.61545: done getting next task for host managed_node1 11579 1726882175.61549: ^ task is: TASK: INIT Prepare setup 11579 1726882175.61550: ^ state is: HOST STATE: block=2, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882175.61552: getting variables 11579 1726882175.61553: in VariableManager get_vars() 11579 1726882175.61567: Calling all_inventory to load vars for managed_node1 11579 1726882175.61569: Calling groups_inventory to load vars for managed_node1 11579 1726882175.61571: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882175.61575: Calling all_plugins_play to load vars for managed_node1 11579 1726882175.61577: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882175.61579: Calling groups_plugins_play to load vars for managed_node1 11579 1726882175.61955: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882175.62382: done with get_vars() 11579 1726882175.62396: done getting variables 11579 1726882175.62576: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [INIT Prepare setup] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:15 Friday 20 September 2024 21:29:35 -0400 (0:00:01.160) 0:00:04.334 ****** 11579 1726882175.62611: entering _queue_task() for managed_node1/debug 11579 1726882175.62612: Creating lock for debug 11579 1726882175.63322: worker is 1 (out of 1 available) 11579 1726882175.63622: exiting _queue_task() for managed_node1/debug 11579 1726882175.63633: done queuing things up, now waiting for results queue to drain 11579 1726882175.63634: waiting for pending results... 
11579 1726882175.63987: running TaskExecutor() for managed_node1/TASK: INIT Prepare setup 11579 1726882175.64198: in run() - task 12673a56-9f93-f197-7423-00000000000b 11579 1726882175.64234: variable 'ansible_search_path' from source: unknown 11579 1726882175.64317: calling self._execute() 11579 1726882175.64630: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.64635: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882175.64638: variable 'omit' from source: magic vars 11579 1726882175.65711: variable 'ansible_distribution_major_version' from source: facts 11579 1726882175.65715: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882175.65717: variable 'omit' from source: magic vars 11579 1726882175.65720: variable 'omit' from source: magic vars 11579 1726882175.65722: variable 'omit' from source: magic vars 11579 1726882175.65724: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882175.65727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882175.65729: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882175.65884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882175.65906: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882175.66051: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882175.66060: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.66068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882175.66306: Set connection var ansible_timeout to 10 11579 1726882175.66318: Set connection 
var ansible_shell_type to sh 11579 1726882175.66329: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882175.66339: Set connection var ansible_shell_executable to /bin/sh 11579 1726882175.66372: Set connection var ansible_pipelining to False 11579 1726882175.66379: Set connection var ansible_connection to ssh 11579 1726882175.66582: variable 'ansible_shell_executable' from source: unknown 11579 1726882175.66585: variable 'ansible_connection' from source: unknown 11579 1726882175.66588: variable 'ansible_module_compression' from source: unknown 11579 1726882175.66590: variable 'ansible_shell_type' from source: unknown 11579 1726882175.66596: variable 'ansible_shell_executable' from source: unknown 11579 1726882175.66598: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.66600: variable 'ansible_pipelining' from source: unknown 11579 1726882175.66602: variable 'ansible_timeout' from source: unknown 11579 1726882175.66604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882175.66768: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882175.66917: variable 'omit' from source: magic vars 11579 1726882175.67001: starting attempt loop 11579 1726882175.67006: running the handler 11579 1726882175.67008: handler run complete 11579 1726882175.67020: attempt loop complete, returning result 11579 1726882175.67109: _execute() done 11579 1726882175.67125: dumping result to json 11579 1726882175.67218: done dumping result, returning 11579 1726882175.67222: done running TaskExecutor() for managed_node1/TASK: INIT Prepare setup [12673a56-9f93-f197-7423-00000000000b] 11579 1726882175.67225: sending task result for task 
12673a56-9f93-f197-7423-00000000000b 11579 1726882175.67301: done sending task result for task 12673a56-9f93-f197-7423-00000000000b 11579 1726882175.67304: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: ################################################## 11579 1726882175.67357: no more pending results, returning what we have 11579 1726882175.67361: results queue empty 11579 1726882175.67361: checking for any_errors_fatal 11579 1726882175.67363: done checking for any_errors_fatal 11579 1726882175.67363: checking for max_fail_percentage 11579 1726882175.67365: done checking for max_fail_percentage 11579 1726882175.67366: checking to see if all hosts have failed and the running result is not ok 11579 1726882175.67367: done checking to see if all hosts have failed 11579 1726882175.67368: getting the remaining hosts for this loop 11579 1726882175.67369: done getting the remaining hosts for this loop 11579 1726882175.67373: getting the next task for host managed_node1 11579 1726882175.67380: done getting next task for host managed_node1 11579 1726882175.67384: ^ task is: TASK: Install dnsmasq 11579 1726882175.67387: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882175.67391: getting variables 11579 1726882175.67397: in VariableManager get_vars() 11579 1726882175.67443: Calling all_inventory to load vars for managed_node1 11579 1726882175.67446: Calling groups_inventory to load vars for managed_node1 11579 1726882175.67448: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882175.67459: Calling all_plugins_play to load vars for managed_node1 11579 1726882175.67462: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882175.67465: Calling groups_plugins_play to load vars for managed_node1 11579 1726882175.68253: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882175.68687: done with get_vars() 11579 1726882175.68817: done getting variables 11579 1726882175.68876: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install dnsmasq] ********************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:29:35 -0400 (0:00:00.062) 0:00:04.397 ****** 11579 1726882175.68998: entering _queue_task() for managed_node1/package 11579 1726882175.69607: worker is 1 (out of 1 available) 11579 1726882175.69620: exiting _queue_task() for managed_node1/package 11579 1726882175.69631: done queuing things up, now waiting for results queue to drain 11579 1726882175.69632: waiting for pending results... 
11579 1726882175.70144: running TaskExecutor() for managed_node1/TASK: Install dnsmasq 11579 1726882175.70200: in run() - task 12673a56-9f93-f197-7423-00000000000f 11579 1726882175.70204: variable 'ansible_search_path' from source: unknown 11579 1726882175.70209: variable 'ansible_search_path' from source: unknown 11579 1726882175.70300: calling self._execute() 11579 1726882175.70359: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.70371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882175.70386: variable 'omit' from source: magic vars 11579 1726882175.70781: variable 'ansible_distribution_major_version' from source: facts 11579 1726882175.70803: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882175.70813: variable 'omit' from source: magic vars 11579 1726882175.70867: variable 'omit' from source: magic vars 11579 1726882175.71105: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882175.73335: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882175.73415: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882175.73455: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882175.73614: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882175.73618: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882175.73640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882175.73671: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882175.73709: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882175.73763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882175.73783: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882175.73908: variable '__network_is_ostree' from source: set_fact 11579 1726882175.73924: variable 'omit' from source: magic vars 11579 1726882175.73966: variable 'omit' from source: magic vars 11579 1726882175.74006: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882175.74059: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882175.74084: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882175.74124: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882175.74144: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882175.74432: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882175.74436: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.74438: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11579 1726882175.74440: Set connection var ansible_timeout to 10 11579 1726882175.74449: Set connection var ansible_shell_type to sh 11579 1726882175.74461: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882175.74470: Set connection var ansible_shell_executable to /bin/sh 11579 1726882175.74481: Set connection var ansible_pipelining to False 11579 1726882175.74487: Set connection var ansible_connection to ssh 11579 1726882175.74524: variable 'ansible_shell_executable' from source: unknown 11579 1726882175.74537: variable 'ansible_connection' from source: unknown 11579 1726882175.74549: variable 'ansible_module_compression' from source: unknown 11579 1726882175.74557: variable 'ansible_shell_type' from source: unknown 11579 1726882175.74564: variable 'ansible_shell_executable' from source: unknown 11579 1726882175.74571: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882175.74580: variable 'ansible_pipelining' from source: unknown 11579 1726882175.74587: variable 'ansible_timeout' from source: unknown 11579 1726882175.74601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882175.75105: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882175.75109: variable 'omit' from source: magic vars 11579 1726882175.75113: starting attempt loop 11579 1726882175.75116: running the handler 11579 1726882175.75118: variable 'ansible_facts' from source: unknown 11579 1726882175.75120: variable 'ansible_facts' from source: unknown 11579 1726882175.75122: _low_level_execute_command(): starting 11579 1726882175.75124: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 
1726882175.76342: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882175.76356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882175.76371: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882175.76388: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882175.76447: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882175.76460: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882175.76515: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882175.76558: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882175.76576: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882175.76632: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882175.76678: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882175.78979: stdout chunk (state=3): >>>/root <<< 11579 1726882175.79401: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882175.79412: stdout chunk (state=3): >>><<< 11579 1726882175.79415: stderr chunk (state=3): >>><<< 11579 
1726882175.79417: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882175.79420: _low_level_execute_command(): starting 11579 1726882175.79422: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136 `" && echo ansible-tmp-1726882175.7931602-11786-156505697202136="` echo /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136 `" ) && sleep 0' 11579 1726882175.80542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882175.80679: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882175.80700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
<<< 11579 1726882175.80710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882175.80789: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882175.80869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882175.80917: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882175.80961: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882175.81016: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882175.84001: stdout chunk (state=3): >>>ansible-tmp-1726882175.7931602-11786-156505697202136=/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136 <<< 11579 1726882175.84041: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882175.84045: stdout chunk (state=3): >>><<< 11579 1726882175.84052: stderr chunk (state=3): >>><<< 11579 1726882175.84079: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882175.7931602-11786-156505697202136=/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882175.84115: variable 'ansible_module_compression' from source: unknown 11579 1726882175.84324: ANSIBALLZ: Using generic lock for ansible.legacy.dnf 11579 1726882175.84327: ANSIBALLZ: Acquiring lock 11579 1726882175.84330: ANSIBALLZ: Lock acquired: 139873763448672 11579 1726882175.84332: ANSIBALLZ: Creating module 11579 1726882176.15899: ANSIBALLZ: Writing module into payload 11579 1726882176.16302: ANSIBALLZ: Writing module 11579 1726882176.16321: ANSIBALLZ: Renaming module 11579 1726882176.16324: ANSIBALLZ: Done creating module 11579 1726882176.16346: variable 'ansible_facts' from source: unknown 11579 1726882176.16470: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py 11579 1726882176.16620: Sending initial data 11579 1726882176.16623: Sent initial data (152 bytes) 11579 1726882176.17918: stderr chunk (state=3): >>>OpenSSH_9.8p1, 
OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882176.18168: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882176.18176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882176.20498: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension 
"users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882176.20508: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882176.20600: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpv5f05p0m /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py <<< 11579 1726882176.20609: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py" <<< 11579 1726882176.20707: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpv5f05p0m" to remote "/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py" <<< 11579 1726882176.21849: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882176.21852: stdout chunk (state=3): >>><<< 11579 1726882176.21860: stderr chunk (state=3): >>><<< 11579 1726882176.21884: done transferring module to remote 11579 1726882176.21898: _low_level_execute_command(): starting 11579 1726882176.21901: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/ /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py && sleep 0' 11579 1726882176.22599: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882176.22603: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882176.22605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882176.22608: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 
1726882176.22610: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882176.22612: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882176.22617: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882176.22632: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882176.22676: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882176.22749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882176.22765: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882176.22844: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882176.25454: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882176.25457: stdout chunk (state=3): >>><<< 11579 1726882176.25459: stderr chunk (state=3): >>><<< 11579 1726882176.25522: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882176.25642: _low_level_execute_command(): starting 11579 1726882176.25646: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/AnsiballZ_dnf.py && sleep 0' 11579 1726882176.26313: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882176.26374: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882176.26464: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882176.26597: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882176.26709: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882177.81914: stdout chunk (state=3): >>> {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}}<<< 11579 1726882177.82011: stdout chunk (state=3): >>> <<< 11579 1726882177.87901: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882177.87934: stderr chunk (state=3): >>><<< 11579 1726882177.87938: stdout chunk (state=3): >>><<< 11579 1726882177.87966: _low_level_execute_command() done: rc=0, stdout= {"msg": "", "changed": true, "results": ["Installed: dnsmasq-2.90-3.el10.x86_64"], "rc": 0, "invocation": {"module_args": {"name": ["dnsmasq"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882177.88009: done with _execute_module (ansible.legacy.dnf, {'name': 'dnsmasq', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882177.88015: _low_level_execute_command(): starting 11579 1726882177.88020: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882175.7931602-11786-156505697202136/ > /dev/null 2>&1 && sleep 0' 11579 1726882177.88478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882177.88516: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882177.88519: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882177.88521: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882177.88523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882177.88525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882177.88586: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882177.88590: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882177.88592: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882177.88655: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882177.90758: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882177.90784: stderr chunk (state=3): >>><<< 11579 1726882177.90787: stdout chunk (state=3): >>><<< 11579 1726882177.90815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882177.90818: handler run complete 11579 1726882177.90943: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882177.91068: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882177.91101: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882177.91128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882177.91148: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882177.91202: variable '__install_status' from source: unknown 11579 1726882177.91217: Evaluated conditional (__install_status is success): True 11579 1726882177.91230: attempt loop complete, returning result 11579 1726882177.91233: _execute() done 11579 1726882177.91235: dumping result to json 11579 1726882177.91242: done dumping result, returning 11579 1726882177.91247: done running TaskExecutor() for managed_node1/TASK: Install dnsmasq [12673a56-9f93-f197-7423-00000000000f] 11579 1726882177.91251: sending task result for task 12673a56-9f93-f197-7423-00000000000f 11579 1726882177.91348: done sending task result for task 12673a56-9f93-f197-7423-00000000000f 11579 1726882177.91351: WORKER PROCESS EXITING changed: [managed_node1] => { "attempts": 1, "changed": true, "rc": 0, "results": [ "Installed: dnsmasq-2.90-3.el10.x86_64" ] } 11579 1726882177.91431: no more pending results, returning what we have 11579 
1726882177.91434: results queue empty 11579 1726882177.91435: checking for any_errors_fatal 11579 1726882177.91440: done checking for any_errors_fatal 11579 1726882177.91440: checking for max_fail_percentage 11579 1726882177.91441: done checking for max_fail_percentage 11579 1726882177.91442: checking to see if all hosts have failed and the running result is not ok 11579 1726882177.91443: done checking to see if all hosts have failed 11579 1726882177.91444: getting the remaining hosts for this loop 11579 1726882177.91445: done getting the remaining hosts for this loop 11579 1726882177.91448: getting the next task for host managed_node1 11579 1726882177.91453: done getting next task for host managed_node1 11579 1726882177.91458: ^ task is: TASK: Install pgrep, sysctl 11579 1726882177.91462: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882177.91465: getting variables 11579 1726882177.91470: in VariableManager get_vars() 11579 1726882177.91538: Calling all_inventory to load vars for managed_node1 11579 1726882177.91541: Calling groups_inventory to load vars for managed_node1 11579 1726882177.91543: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882177.91553: Calling all_plugins_play to load vars for managed_node1 11579 1726882177.91555: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882177.91558: Calling groups_plugins_play to load vars for managed_node1 11579 1726882177.91711: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882177.91837: done with get_vars() 11579 1726882177.91846: done getting variables 11579 1726882177.91886: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Install pgrep, sysctl] *************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:17 Friday 20 September 2024 21:29:37 -0400 (0:00:02.230) 0:00:06.627 ****** 11579 1726882177.91914: entering _queue_task() for managed_node1/package 11579 1726882177.92162: worker is 1 (out of 1 available) 11579 1726882177.92173: exiting _queue_task() for managed_node1/package 11579 1726882177.92186: done queuing things up, now waiting for results queue to drain 11579 1726882177.92188: waiting for pending results... 
11579 1726882177.92624: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11579 1726882177.92628: in run() - task 12673a56-9f93-f197-7423-000000000010 11579 1726882177.92640: variable 'ansible_search_path' from source: unknown 11579 1726882177.92656: variable 'ansible_search_path' from source: unknown 11579 1726882177.92719: calling self._execute() 11579 1726882177.92848: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882177.92866: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882177.92884: variable 'omit' from source: magic vars 11579 1726882177.93341: variable 'ansible_distribution_major_version' from source: facts 11579 1726882177.93360: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882177.93499: variable 'ansible_os_family' from source: facts 11579 1726882177.93520: Evaluated conditional (ansible_os_family == 'RedHat'): True 11579 1726882177.93816: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882177.94256: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882177.94319: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882177.94360: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882177.94384: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882177.94501: variable 'ansible_distribution_major_version' from source: facts 11579 1726882177.94525: Evaluated conditional (ansible_distribution_major_version is version('6', '<=')): False 11579 1726882177.94528: when evaluation is False, skipping this task 11579 1726882177.94698: _execute() done 11579 1726882177.94702: dumping result to json 11579 1726882177.94704: done dumping result, 
returning 11579 1726882177.94708: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [12673a56-9f93-f197-7423-000000000010] 11579 1726882177.94710: sending task result for task 12673a56-9f93-f197-7423-000000000010 11579 1726882177.94806: done sending task result for task 12673a56-9f93-f197-7423-000000000010 11579 1726882177.94810: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version is version('6', '<=')", "skip_reason": "Conditional result was False" } 11579 1726882177.94863: no more pending results, returning what we have 11579 1726882177.94865: results queue empty 11579 1726882177.94866: checking for any_errors_fatal 11579 1726882177.94872: done checking for any_errors_fatal 11579 1726882177.94873: checking for max_fail_percentage 11579 1726882177.94874: done checking for max_fail_percentage 11579 1726882177.94875: checking to see if all hosts have failed and the running result is not ok 11579 1726882177.94876: done checking to see if all hosts have failed 11579 1726882177.94880: getting the remaining hosts for this loop 11579 1726882177.94882: done getting the remaining hosts for this loop 11579 1726882177.94889: getting the next task for host managed_node1 11579 1726882177.94901: done getting next task for host managed_node1 11579 1726882177.94904: ^ task is: TASK: Install pgrep, sysctl 11579 1726882177.94907: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11579 1726882177.94913: getting variables 11579 1726882177.94918: in VariableManager get_vars() 11579 1726882177.94970: Calling all_inventory to load vars for managed_node1 11579 1726882177.94974: Calling groups_inventory to load vars for managed_node1 11579 1726882177.94979: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882177.94995: Calling all_plugins_play to load vars for managed_node1 11579 1726882177.95002: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882177.95006: Calling groups_plugins_play to load vars for managed_node1 11579 1726882177.95257: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882177.95450: done with get_vars() 11579 1726882177.95457: done getting variables 11579 1726882177.95500: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [Install pgrep, sysctl] ***************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
Friday 20 September 2024 21:29:37 -0400 (0:00:00.036) 0:00:06.663 ******
11579 1726882177.95524: entering _queue_task() for managed_node1/package 11579 1726882177.96207: worker is 1 (out of 1 available) 11579 1726882177.96218: exiting _queue_task() for managed_node1/package 11579 1726882177.96231: done queuing things up, now waiting for results queue to drain 11579 1726882177.96234: waiting for pending results... 
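The log to this point shows the first "Install pgrep, sysctl" task being skipped: its conditional `ansible_distribution_major_version is version('6', '<=')` evaluated to False, so the executor returned a skip result, while the second copy of the task (gated on `version('7', '>=')`, evaluated below) goes on to run. A minimal sketch of the kind of comparison behind those `Evaluated conditional` lines, assuming dotted integer versions only (the `version_compare` helper here is hypothetical, not Ansible's actual `version` test implementation):

```python
# Hypothetical, simplified version comparison mimicking the `version` Jinja2
# test seen in the log; the real test supports additional version schemes.
import operator

_OPS = {"<": operator.lt, "<=": operator.le, ">": operator.gt,
        ">=": operator.ge, "==": operator.eq, "!=": operator.ne}

def version_compare(value, other, op):
    """Compare two dotted-integer version strings with the given operator."""
    parse = lambda v: tuple(int(p) for p in str(v).split("."))
    return _OPS[op](parse(value), parse(other))

# On a RHEL-9-like host: the "6, <=" conditional is False (task skipped),
# while the "7, >=" conditional is True (task runs).
major = "9"
skip_conditional = version_compare(major, "6", "<=")
run_conditional = version_compare(major, "7", ">=")
```

Tuple comparison keeps "10" ordering correctly after "9", which a plain string comparison would get wrong.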
11579 1726882177.96705: running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl 11579 1726882177.96713: in run() - task 12673a56-9f93-f197-7423-000000000011 11579 1726882177.96718: variable 'ansible_search_path' from source: unknown 11579 1726882177.96721: variable 'ansible_search_path' from source: unknown 11579 1726882177.96744: calling self._execute() 11579 1726882177.96822: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882177.96834: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882177.96848: variable 'omit' from source: magic vars 11579 1726882177.97274: variable 'ansible_distribution_major_version' from source: facts 11579 1726882177.97301: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882177.97599: variable 'ansible_os_family' from source: facts 11579 1726882177.97602: Evaluated conditional (ansible_os_family == 'RedHat'): True 11579 1726882177.97605: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882177.97840: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882177.97886: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882177.97927: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882177.97961: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882177.98039: variable 'ansible_distribution_major_version' from source: facts 11579 1726882177.98054: Evaluated conditional (ansible_distribution_major_version is version('7', '>=')): True 11579 1726882177.98062: variable 'omit' from source: magic vars 11579 1726882177.98112: variable 'omit' from source: magic vars 11579 1726882177.98600: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882178.00835: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882178.00911: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882178.00951: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882178.00989: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882178.01024: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882178.01119: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882178.01150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882178.01180: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882178.01230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882178.01251: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882178.01351: variable '__network_is_ostree' from source: set_fact 11579 1726882178.01361: 
variable 'omit' from source: magic vars 11579 1726882178.01398: variable 'omit' from source: magic vars 11579 1726882178.01428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882178.01459: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882178.01482: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882178.01508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882178.01524: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882178.01557: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882178.01565: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882178.01573: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882178.01673: Set connection var ansible_timeout to 10 11579 1726882178.01683: Set connection var ansible_shell_type to sh 11579 1726882178.01692: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882178.01707: Set connection var ansible_shell_executable to /bin/sh 11579 1726882178.01716: Set connection var ansible_pipelining to False 11579 1726882178.01721: Set connection var ansible_connection to ssh 11579 1726882178.01743: variable 'ansible_shell_executable' from source: unknown 11579 1726882178.01748: variable 'ansible_connection' from source: unknown 11579 1726882178.01753: variable 'ansible_module_compression' from source: unknown 11579 1726882178.01758: variable 'ansible_shell_type' from source: unknown 11579 1726882178.01762: variable 'ansible_shell_executable' from source: unknown 11579 1726882178.01767: variable 'ansible_host' from source: host vars for 'managed_node1' 
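The `Set connection var ...` lines above show connection settings being resolved from layered sources: most values (`ansible_timeout`, `ansible_shell_type`, `ansible_connection`, ...) come from defaults and are logged with `source: unknown`, while `ansible_host` and `ansible_ssh_extra_args` come from host vars for `managed_node1`. A sketch of that layering with `collections.ChainMap`, assuming a simplified two-level precedence (Ansible's real VariableManager merges many more sources, and the `ansible_ssh_extra_args` value below is hypothetical):

```python
# Simplified sketch of connection-var resolution: host vars override
# built-in defaults. The real precedence chain has many more layers.
from collections import ChainMap

defaults = {
    "ansible_timeout": 10,
    "ansible_shell_type": "sh",
    "ansible_shell_executable": "/bin/sh",
    "ansible_pipelining": False,
    "ansible_connection": "ssh",
}

host_vars = {  # what the inventory supplied for managed_node1 in this run
    "ansible_host": "10.31.9.159",
    "ansible_ssh_extra_args": "-o StrictHostKeyChecking=no",  # hypothetical value
}

# The first mapping wins a lookup, so host vars shadow the defaults.
conn_vars = ChainMap(host_vars, defaults)
```

This matches the log: the timeout, shell, and connection type resolve to the defaults, while the target address comes from the inventory.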
11579 1726882178.01773: variable 'ansible_pipelining' from source: unknown 11579 1726882178.01778: variable 'ansible_timeout' from source: unknown 11579 1726882178.01783: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882178.01870: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882178.02007: variable 'omit' from source: magic vars 11579 1726882178.02018: starting attempt loop 11579 1726882178.02039: running the handler 11579 1726882178.02051: variable 'ansible_facts' from source: unknown 11579 1726882178.02057: variable 'ansible_facts' from source: unknown 11579 1726882178.02145: _low_level_execute_command(): starting 11579 1726882178.02148: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882178.03029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 
1726882178.03084: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.03106: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.03622: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.03680: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.05255: stdout chunk (state=3): >>>/root <<< 11579 1726882178.05404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.05408: stdout chunk (state=3): >>><<< 11579 1726882178.05411: stderr chunk (state=3): >>><<< 11579 1726882178.05534: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.05538: 
_low_level_execute_command(): starting 11579 1726882178.05541: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613 `" && echo ansible-tmp-1726882178.054426-11890-95887332141613="` echo /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613 `" ) && sleep 0' 11579 1726882178.06885: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.06906: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.06918: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.06972: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.07145: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.07288: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.07291: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.09083: stdout chunk (state=3): 
>>>ansible-tmp-1726882178.054426-11890-95887332141613=/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613 <<< 11579 1726882178.09312: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.09324: stdout chunk (state=3): >>><<< 11579 1726882178.09335: stderr chunk (state=3): >>><<< 11579 1726882178.09358: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882178.054426-11890-95887332141613=/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.09401: variable 'ansible_module_compression' from source: unknown 11579 1726882178.09477: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.dnf-ZIP_DEFLATED 11579 1726882178.09542: variable 'ansible_facts' from 
source: unknown 11579 1726882178.09679: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py 11579 1726882178.09925: Sending initial data 11579 1726882178.09942: Sent initial data (150 bytes) 11579 1726882178.10514: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.10601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.10741: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.10782: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.12291: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882178.12318: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" 
revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882178.12364: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882178.12426: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmps1q3a81y /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py <<< 11579 1726882178.12430: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py" <<< 11579 1726882178.12483: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmps1q3a81y" to remote "/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py" <<< 11579 1726882178.13639: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.13643: stdout chunk (state=3): >>><<< 11579 1726882178.13645: stderr chunk (state=3): >>><<< 11579 1726882178.13647: done transferring module to remote 11579 1726882178.13649: _low_level_execute_command(): starting 11579 1726882178.13651: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/ /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py && sleep 0' 11579 1726882178.14230: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882178.14245: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.14325: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.14501: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.14522: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.14552: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.14653: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.16349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.16409: stdout chunk (state=3): >>><<< 11579 1726882178.16412: stderr chunk (state=3): >>><<< 11579 1726882178.16699: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.16703: _low_level_execute_command(): starting 11579 1726882178.16706: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/AnsiballZ_dnf.py && sleep 0' 11579 1726882178.17788: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882178.17810: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.18011: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.18028: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.18045: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.18068: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.18152: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.58709: stdout chunk (state=3): >>> {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} <<< 11579 1726882178.62667: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882178.62711: stderr chunk (state=3): >>><<< 11579 1726882178.62716: stdout chunk (state=3): >>><<< 11579 1726882178.62811: _low_level_execute_command() done: rc=0, stdout= {"msg": "Nothing to do", "changed": false, "results": [], "rc": 0, "invocation": {"module_args": {"name": ["procps-ng"], "state": "present", "allow_downgrade": false, "allowerasing": false, "autoremove": false, "bugfix": false, "cacheonly": false, "disable_gpg_check": false, "disable_plugin": [], "disablerepo": [], "download_only": false, "enable_plugin": [], "enablerepo": [], "exclude": [], "installroot": "/", "install_repoquery": true, "install_weak_deps": true, "security": false, "skip_broken": false, "update_cache": false, "update_only": false, "validate_certs": true, "sslverify": true, "lock_timeout": 30, "use_backend": "auto", "best": null, "conf_file": null, "disable_excludes": null, "download_dir": null, "list": null, "nobest": null, "releasever": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882178.62820: done with _execute_module (ansible.legacy.dnf, {'name': 'procps-ng', 'state': 'present', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.dnf', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882178.62823: _low_level_execute_command(): starting 11579 1726882178.62828: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882178.054426-11890-95887332141613/ > /dev/null 2>&1 && sleep 0' 11579 1726882178.63423: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882178.63430: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.63433: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882178.63468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882178.63500: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882178.63503: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882178.63505: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.63615: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.63618: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.63635: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.63681: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.65501: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.65534: stderr chunk (state=3): >>><<< 11579 1726882178.65537: stdout chunk (state=3): >>><<< 11579 1726882178.65550: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.65556: handler run complete 11579 1726882178.65579: attempt loop complete, returning result 11579 1726882178.65582: _execute() done 11579 1726882178.65584: dumping result to json 11579 1726882178.65590: done dumping result, returning 11579 1726882178.65600: done running TaskExecutor() for managed_node1/TASK: Install pgrep, sysctl [12673a56-9f93-f197-7423-000000000011] 11579 1726882178.65605: sending task result for task 12673a56-9f93-f197-7423-000000000011 11579 1726882178.65704: done sending task result for task 12673a56-9f93-f197-7423-000000000011 11579 1726882178.65706: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "rc": 0, "results": [] } MSG: Nothing to do 11579 1726882178.65796: no more pending results, returning what we have 11579 1726882178.65799: results queue empty 11579 1726882178.65800: checking for any_errors_fatal 11579 1726882178.65806: done checking for any_errors_fatal 11579 1726882178.65807: checking for max_fail_percentage 11579 1726882178.65808: done checking for max_fail_percentage 11579 1726882178.65809: checking to see if all hosts have failed and the running result is not ok 11579 1726882178.65810: done checking to see if all hosts have failed 11579 1726882178.65811: getting the remaining hosts for this loop 11579 1726882178.65812: done getting the remaining hosts for this loop 11579 1726882178.65816: getting the next task for host managed_node1 11579 1726882178.65821: done getting next task for host managed_node1 11579 1726882178.65823: 
^ task is: TASK: Create test interfaces 11579 1726882178.65826: ^ state is: HOST STATE: block=2, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882178.65830: getting variables 11579 1726882178.65832: in VariableManager get_vars() 11579 1726882178.65871: Calling all_inventory to load vars for managed_node1 11579 1726882178.65874: Calling groups_inventory to load vars for managed_node1 11579 1726882178.65876: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882178.65885: Calling all_plugins_play to load vars for managed_node1 11579 1726882178.65887: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882178.65890: Calling groups_plugins_play to load vars for managed_node1 11579 1726882178.66037: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882178.66160: done with get_vars() 11579 1726882178.66169: done getting variables 11579 1726882178.66241: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [Create test interfaces] ************************************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35 Friday 20 September 2024 21:29:38 -0400 (0:00:00.707) 0:00:07.371 ****** 11579 1726882178.66262: entering _queue_task() for managed_node1/shell 11579 1726882178.66263: Creating lock for shell 11579 1726882178.66565: worker is 1 (out of 1 available) 11579 1726882178.66577: exiting _queue_task() for managed_node1/shell 11579 1726882178.66588: done queuing things up, now waiting for results queue to drain 11579 1726882178.66589: waiting for pending results... 11579 1726882178.66814: running TaskExecutor() for managed_node1/TASK: Create test interfaces 11579 1726882178.66869: in run() - task 12673a56-9f93-f197-7423-000000000012 11579 1726882178.66922: variable 'ansible_search_path' from source: unknown 11579 1726882178.66927: variable 'ansible_search_path' from source: unknown 11579 1726882178.67018: calling self._execute() 11579 1726882178.67031: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882178.67036: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882178.67053: variable 'omit' from source: magic vars 11579 1726882178.67341: variable 'ansible_distribution_major_version' from source: facts 11579 1726882178.67351: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882178.67356: variable 'omit' from source: magic vars 11579 1726882178.67391: variable 'omit' from source: magic vars 11579 1726882178.67748: variable 'dhcp_interface1' from source: play vars 11579 1726882178.67752: variable 'dhcp_interface2' from source: play vars 11579 1726882178.67777: variable 'omit' from source: magic vars 11579 1726882178.67810: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882178.67840: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 11579 1726882178.67855: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882178.67868: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882178.67877: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882178.67904: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882178.67908: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882178.67910: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882178.67988: Set connection var ansible_timeout to 10 11579 1726882178.67992: Set connection var ansible_shell_type to sh 11579 1726882178.68003: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882178.68008: Set connection var ansible_shell_executable to /bin/sh 11579 1726882178.68014: Set connection var ansible_pipelining to False 11579 1726882178.68017: Set connection var ansible_connection to ssh 11579 1726882178.68033: variable 'ansible_shell_executable' from source: unknown 11579 1726882178.68036: variable 'ansible_connection' from source: unknown 11579 1726882178.68038: variable 'ansible_module_compression' from source: unknown 11579 1726882178.68040: variable 'ansible_shell_type' from source: unknown 11579 1726882178.68044: variable 'ansible_shell_executable' from source: unknown 11579 1726882178.68046: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882178.68049: variable 'ansible_pipelining' from source: unknown 11579 1726882178.68052: variable 'ansible_timeout' from source: unknown 11579 1726882178.68054: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882178.68151: Loading ActionModule 'shell' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882178.68160: variable 'omit' from source: magic vars 11579 1726882178.68166: starting attempt loop 11579 1726882178.68168: running the handler 11579 1726882178.68183: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882178.68192: _low_level_execute_command(): starting 11579 1726882178.68202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882178.68722: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882178.68726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.68729: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882178.68731: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882178.68733: stderr chunk 
(state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.68782: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.68785: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.68787: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.68836: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.70383: stdout chunk (state=3): >>>/root <<< 11579 1726882178.70489: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.70510: stderr chunk (state=3): >>><<< 11579 1726882178.70513: stdout chunk (state=3): >>><<< 11579 1726882178.70532: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master 
session id: 2 debug2: Received exit status from master 0 11579 1726882178.70543: _low_level_execute_command(): starting 11579 1726882178.70548: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963 `" && echo ansible-tmp-1726882178.7053134-11934-107409202815963="` echo /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963 `" ) && sleep 0' 11579 1726882178.71029: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882178.71032: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.71035: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882178.71037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882178.71039: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.71118: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.71156: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session 
id: 2 <<< 11579 1726882178.73001: stdout chunk (state=3): >>>ansible-tmp-1726882178.7053134-11934-107409202815963=/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963 <<< 11579 1726882178.73111: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.73141: stderr chunk (state=3): >>><<< 11579 1726882178.73145: stdout chunk (state=3): >>><<< 11579 1726882178.73158: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882178.7053134-11934-107409202815963=/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.73184: variable 'ansible_module_compression' from source: unknown 11579 1726882178.73277: ANSIBALLZ: Using generic lock for ansible.legacy.command 11579 1726882178.73282: ANSIBALLZ: Acquiring lock 11579 
1726882178.73288: ANSIBALLZ: Lock acquired: 139873763448672 11579 1726882178.73290: ANSIBALLZ: Creating module 11579 1726882178.87202: ANSIBALLZ: Writing module into payload 11579 1726882178.87206: ANSIBALLZ: Writing module 11579 1726882178.87208: ANSIBALLZ: Renaming module 11579 1726882178.87210: ANSIBALLZ: Done creating module 11579 1726882178.87212: variable 'ansible_facts' from source: unknown 11579 1726882178.87448: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py 11579 1726882178.87902: Sending initial data 11579 1726882178.87905: Sent initial data (156 bytes) 11579 1726882178.89083: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882178.89403: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.89420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.89462: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.89505: stderr chunk (state=3): >>>debug1: 
mux_client_request_session: master session id: 2 <<< 11579 1726882178.91027: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882178.91066: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882178.91109: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpxyxnwigp /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py <<< 11579 1726882178.91120: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py" <<< 11579 1726882178.91177: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpxyxnwigp" to remote "/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py" <<< 11579 1726882178.92457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.92510: stderr chunk (state=3): >>><<< 11579 1726882178.92520: stdout chunk (state=3): >>><<< 11579 
1726882178.92578: done transferring module to remote 11579 1726882178.92600: _low_level_execute_command(): starting 11579 1726882178.92611: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/ /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py && sleep 0' 11579 1726882178.93289: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.93322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882178.93339: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.93417: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882178.95112: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882178.95149: stderr chunk (state=3): >>><<< 11579 1726882178.95255: stdout chunk (state=3): >>><<< 11579 1726882178.95265: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882178.95270: _low_level_execute_command(): starting 11579 1726882178.95272: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/AnsiballZ_command.py && sleep 0' 11579 1726882178.95821: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882178.95883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882178.95940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882178.95971: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.32878: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 701 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 701 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 
--dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/<<< 11579 1726882180.32911: stdout chunk (state=3): >>>show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:39.106335", "end": "2024-09-20 21:29:40.326543", "delta": "0:00:01.220208", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! 
rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882180.34529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882180.34533: stderr chunk (state=3): >>><<< 11579 1726882180.34536: stdout chunk (state=3): >>><<< 11579 1726882180.34866: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ ip link add test1 type veth peer name test1p\n+ ip link add test2 type veth peer name test2p\n++ pgrep NetworkManager\n+ '[' -n 701 ']'\n+ nmcli d set test1 managed true\n+ nmcli d set test2 managed true\n+ nmcli d set test1p managed false\n+ nmcli d set test2p managed false\n+ ip link set test1p up\n+ ip link set test2p up\n+ ip link add name testbr type bridge forward_delay 0\n++ pgrep NetworkManager\n+ '[' -n 701 ']'\n+ nmcli d set testbr managed false\n+ ip link set testbr up\n+ timer=0\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ let timer+=1\n+ '[' 1 -eq 30 ']'\n+ sleep 1\n+ rc=0\n+ ip addr add 192.0.2.1/24 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip -6 addr add 2001:DB8::1/32 dev testbr\n+ '[' 0 '!=' 0 ']'\n+ ip addr show testbr\n+ grep -q 'inet [1-9]'\n+ grep 'release 6' /etc/redhat-release\n+ ip link set test1p master testbr\n+ ip link set test2p master testbr\n+ systemctl is-active firewalld\ninactive\n+ dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # 
NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active 
firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "start": "2024-09-20 21:29:39.106335", "end": "2024-09-20 21:29:40.326543", "delta": "0:00:01.220208", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882180.34877: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n "$(pgrep NetworkManager)" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the \'testbr\' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n "$(pgrep NetworkManager)" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! ip addr show testbr | grep -q \'inet [1-9]\'\ndo\n let "timer+=1"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc="$?"\n if [ "$rc" != 0 ]; then\n echo NOTICE - could not add testbr - error code "$rc"\n continue\n fi\ndone\n\nif grep \'release 6\' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! 
rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo \'interface testbr {\' > /etc/radvd.conf\n echo \' AdvSendAdvert on;\' >> /etc/radvd.conf\n echo \' prefix 2001:DB8::/64 { \' >> /etc/radvd.conf\n echo \' AdvOnLink on; }; \' >> /etc/radvd.conf\n echo \' }; \' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service="$service"; then\n firewall-cmd --add-service "$service"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882180.34880: _low_level_execute_command(): starting 11579 1726882180.35102: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882178.7053134-11934-107409202815963/ > /dev/null 2>&1 && sleep 0' 11579 1726882180.35967: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882180.35978: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.35992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.36080: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.36119: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.36131: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.36149: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.36225: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.38088: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.38092: stdout chunk (state=3): >>><<< 11579 1726882180.38102: stderr chunk (state=3): >>><<< 11579 1726882180.38126: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882180.38132: handler run complete 11579 1726882180.38160: Evaluated conditional (False): False 11579 1726882180.38170: attempt loop complete, returning result 11579 1726882180.38173: _execute() done 11579 1726882180.38176: dumping result to json 11579 1726882180.38183: done dumping result, returning 11579 1726882180.38196: done running TaskExecutor() for managed_node1/TASK: Create test interfaces [12673a56-9f93-f197-7423-000000000012] 11579 1726882180.38199: sending task result for task 12673a56-9f93-f197-7423-000000000012 ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nip link add test1 type veth peer name test1p\nip link add test2 type veth peer name test2p\nif [ -n \"$(pgrep NetworkManager)\" ];then\n nmcli d set test1 managed true\n nmcli d set test2 managed true\n # NetworkManager should not manage DHCP server ports\n nmcli d set test1p managed false\n nmcli d set test2p managed false\nfi\nip link set test1p up\nip link set test2p up\n\n# Create the 'testbr' - providing both 10.x ipv4 and 2620:52:0 ipv6 dhcp\nip link add name testbr type bridge forward_delay 0\nif [ -n \"$(pgrep NetworkManager)\" ];then\n # NetworkManager should not manage DHCP server ports\n nmcli d set testbr managed false\nfi\nip link set testbr up\ntimer=0\n# The while loop following is a workaround for the NM bug, which can be\n# tracked in https://bugzilla.redhat.com/show_bug.cgi?id=2079642\nwhile ! 
ip addr show testbr | grep -q 'inet [1-9]'\ndo\n let \"timer+=1\"\n if [ $timer -eq 30 ]; then\n echo ERROR - could not add testbr\n ip addr\n exit 1\n fi\n sleep 1\n rc=0\n ip addr add 192.0.2.1/24 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\n ip -6 addr add 2001:DB8::1/32 dev testbr || rc=\"$?\"\n if [ \"$rc\" != 0 ]; then\n echo NOTICE - could not add testbr - error code \"$rc\"\n continue\n fi\ndone\n\nif grep 'release 6' /etc/redhat-release; then\n # We need bridge-utils and radvd only in rhel6\n if ! rpm -q --quiet radvd; then yum -y install radvd; fi\n if ! rpm -q --quiet bridge-utils; then yum -y install bridge-utils; fi\n\n # We need to add iptables rule to allow dhcp request\n iptables -I INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\n\n # Add test1, test2 peers into the testbr\n brctl addif testbr test1p\n brctl addif testbr test2p\n\n # in RHEL6 /run is not present\n mkdir -p /run\n\n # and dnsmasq does not support ipv6\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --interface=testbr --bind-interfaces\n\n # start radvd for ipv6\n echo 'interface testbr {' > /etc/radvd.conf\n echo ' AdvSendAdvert on;' >> /etc/radvd.conf\n echo ' prefix 2001:DB8::/64 { ' >> /etc/radvd.conf\n echo ' AdvOnLink on; }; ' >> /etc/radvd.conf\n echo ' }; ' >> /etc/radvd.conf\n\n # enable ipv6 forwarding\n sysctl -w net.ipv6.conf.all.forwarding=1\n service radvd restart\n\nelse\n ip link set test1p master testbr\n ip link set test2p master testbr\n # Run joint DHCP4/DHCP6 server with RA enabled in veth namespace\n if systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if ! 
firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --add-service \"$service\"\n fi\n done\n fi\n dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces\nfi\n", "delta": "0:00:01.220208", "end": "2024-09-20 21:29:40.326543", "rc": 0, "start": "2024-09-20 21:29:39.106335" } STDERR: + exec + ip link add test1 type veth peer name test1p + ip link add test2 type veth peer name test2p ++ pgrep NetworkManager + '[' -n 701 ']' + nmcli d set test1 managed true + nmcli d set test2 managed true + nmcli d set test1p managed false + nmcli d set test2p managed false + ip link set test1p up + ip link set test2p up + ip link add name testbr type bridge forward_delay 0 ++ pgrep NetworkManager + '[' -n 701 ']' + nmcli d set testbr managed false + ip link set testbr up + timer=0 + ip addr show testbr + grep -q 'inet [1-9]' + let timer+=1 + '[' 1 -eq 30 ']' + sleep 1 + rc=0 + ip addr add 192.0.2.1/24 dev testbr + '[' 0 '!=' 0 ']' + ip -6 addr add 2001:DB8::1/32 dev testbr + '[' 0 '!=' 0 ']' + ip addr show testbr + grep -q 'inet [1-9]' + grep 'release 6' /etc/redhat-release + ip link set test1p master testbr + ip link set test2p master testbr + systemctl is-active firewalld inactive + dnsmasq --pid-file=/run/dhcp_testbr.pid --dhcp-leasefile=/run/dhcp_testbr.lease --dhcp-range=192.0.2.1,192.0.2.254,240 --dhcp-range=2001:DB8::10,2001:DB8::1FF,slaac,64,240 --enable-ra --interface=testbr --bind-interfaces 11579 1726882180.38542: no more pending results, returning what we have 11579 1726882180.38546: results queue empty 11579 1726882180.38546: checking for any_errors_fatal 11579 1726882180.38552: done checking for any_errors_fatal 11579 1726882180.38552: checking for max_fail_percentage 11579 1726882180.38554: done checking for max_fail_percentage 11579 1726882180.38555: checking to see if all hosts have failed and 
the running result is not ok 11579 1726882180.38556: done checking to see if all hosts have failed 11579 1726882180.38556: getting the remaining hosts for this loop 11579 1726882180.38558: done getting the remaining hosts for this loop 11579 1726882180.38561: getting the next task for host managed_node1 11579 1726882180.38569: done getting next task for host managed_node1 11579 1726882180.38571: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11579 1726882180.38573: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882180.38577: getting variables 11579 1726882180.38578: in VariableManager get_vars() 11579 1726882180.38617: Calling all_inventory to load vars for managed_node1 11579 1726882180.38619: Calling groups_inventory to load vars for managed_node1 11579 1726882180.38621: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.38630: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.38632: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.38634: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.38886: done sending task result for task 12673a56-9f93-f197-7423-000000000012 11579 1726882180.38889: WORKER PROCESS EXITING 11579 1726882180.38914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.39535: done with get_vars() 11579 1726882180.39601: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:01.735) 0:00:09.106 ****** 11579 1726882180.39789: entering _queue_task() for managed_node1/include_tasks 11579 1726882180.40115: worker is 1 (out of 1 available) 11579 1726882180.40131: exiting _queue_task() for managed_node1/include_tasks 11579 1726882180.40143: done queuing things up, now waiting for results queue to drain 11579 1726882180.40145: waiting for pending results... 
11579 1726882180.40462: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11579 1726882180.40762: in run() - task 12673a56-9f93-f197-7423-000000000016 11579 1726882180.40783: variable 'ansible_search_path' from source: unknown 11579 1726882180.40791: variable 'ansible_search_path' from source: unknown 11579 1726882180.40837: calling self._execute() 11579 1726882180.40928: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.40939: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.40953: variable 'omit' from source: magic vars 11579 1726882180.41323: variable 'ansible_distribution_major_version' from source: facts 11579 1726882180.41339: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882180.41349: _execute() done 11579 1726882180.41357: dumping result to json 11579 1726882180.41364: done dumping result, returning 11579 1726882180.41376: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-f197-7423-000000000016] 11579 1726882180.41386: sending task result for task 12673a56-9f93-f197-7423-000000000016 11579 1726882180.41512: no more pending results, returning what we have 11579 1726882180.41517: in VariableManager get_vars() 11579 1726882180.41671: Calling all_inventory to load vars for managed_node1 11579 1726882180.41675: Calling groups_inventory to load vars for managed_node1 11579 1726882180.41677: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.41688: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.41690: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.41694: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.41708: done sending task result for task 12673a56-9f93-f197-7423-000000000016 11579 1726882180.41711: WORKER PROCESS EXITING 11579 
1726882180.41881: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.42095: done with get_vars() 11579 1726882180.42103: variable 'ansible_search_path' from source: unknown 11579 1726882180.42104: variable 'ansible_search_path' from source: unknown 11579 1726882180.42154: we have included files to process 11579 1726882180.42155: generating all_blocks data 11579 1726882180.42157: done generating all_blocks data 11579 1726882180.42158: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.42160: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.42162: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.42409: done processing included file 11579 1726882180.42411: iterating over new_blocks loaded from include file 11579 1726882180.42412: in VariableManager get_vars() 11579 1726882180.42433: done with get_vars() 11579 1726882180.42434: filtering new block on tags 11579 1726882180.42459: done filtering new block on tags 11579 1726882180.42462: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11579 1726882180.42466: extending task lists for all hosts with included blocks 11579 1726882180.42610: done extending task lists 11579 1726882180.42612: done processing included files 11579 1726882180.42612: results queue empty 11579 1726882180.42613: checking for any_errors_fatal 11579 1726882180.42618: done checking for any_errors_fatal 11579 1726882180.42619: checking for max_fail_percentage 11579 1726882180.42620: done checking for 
max_fail_percentage 11579 1726882180.42621: checking to see if all hosts have failed and the running result is not ok 11579 1726882180.42621: done checking to see if all hosts have failed 11579 1726882180.42622: getting the remaining hosts for this loop 11579 1726882180.42623: done getting the remaining hosts for this loop 11579 1726882180.42625: getting the next task for host managed_node1 11579 1726882180.42630: done getting next task for host managed_node1 11579 1726882180.42632: ^ task is: TASK: Get stat for interface {{ interface }} 11579 1726882180.42635: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882180.42637: getting variables 11579 1726882180.42638: in VariableManager get_vars() 11579 1726882180.42651: Calling all_inventory to load vars for managed_node1 11579 1726882180.42653: Calling groups_inventory to load vars for managed_node1 11579 1726882180.42655: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.42659: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.42661: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.42671: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.42817: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.43019: done with get_vars() 11579 1726882180.43028: done getting variables 11579 1726882180.43192: variable 'interface' from source: task vars 11579 1726882180.43199: variable 'dhcp_interface1' from source: play vars 11579 1726882180.43271: variable 'dhcp_interface1' from source: play vars TASK [Get stat for interface test1] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.035) 0:00:09.141 ****** 11579 1726882180.43316: entering _queue_task() for managed_node1/stat 11579 1726882180.43687: worker is 1 (out of 1 available) 11579 1726882180.43702: exiting _queue_task() for managed_node1/stat 11579 1726882180.43713: done queuing things up, now waiting for results queue to drain 11579 1726882180.43714: waiting for pending results... 
11579 1726882180.43924: running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 11579 1726882180.44053: in run() - task 12673a56-9f93-f197-7423-000000000152 11579 1726882180.44074: variable 'ansible_search_path' from source: unknown 11579 1726882180.44091: variable 'ansible_search_path' from source: unknown 11579 1726882180.44138: calling self._execute() 11579 1726882180.44232: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.44245: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.44259: variable 'omit' from source: magic vars 11579 1726882180.44641: variable 'ansible_distribution_major_version' from source: facts 11579 1726882180.44648: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882180.44660: variable 'omit' from source: magic vars 11579 1726882180.44749: variable 'omit' from source: magic vars 11579 1726882180.44828: variable 'interface' from source: task vars 11579 1726882180.44838: variable 'dhcp_interface1' from source: play vars 11579 1726882180.44912: variable 'dhcp_interface1' from source: play vars 11579 1726882180.44967: variable 'omit' from source: magic vars 11579 1726882180.44988: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882180.45031: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882180.45056: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882180.45088: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.45300: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.45304: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882180.45306: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.45308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.45311: Set connection var ansible_timeout to 10 11579 1726882180.45313: Set connection var ansible_shell_type to sh 11579 1726882180.45315: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882180.45317: Set connection var ansible_shell_executable to /bin/sh 11579 1726882180.45318: Set connection var ansible_pipelining to False 11579 1726882180.45321: Set connection var ansible_connection to ssh 11579 1726882180.45323: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.45325: variable 'ansible_connection' from source: unknown 11579 1726882180.45327: variable 'ansible_module_compression' from source: unknown 11579 1726882180.45329: variable 'ansible_shell_type' from source: unknown 11579 1726882180.45331: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.45333: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.45348: variable 'ansible_pipelining' from source: unknown 11579 1726882180.45356: variable 'ansible_timeout' from source: unknown 11579 1726882180.45364: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.45576: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882180.45591: variable 'omit' from source: magic vars 11579 1726882180.45606: starting attempt loop 11579 1726882180.45612: running the handler 11579 1726882180.45631: _low_level_execute_command(): starting 11579 1726882180.45642: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 
1726882180.46400: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882180.46404: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.46407: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.46410: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882180.46413: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882180.46416: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882180.46418: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.46423: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882180.46425: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882180.46436: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882180.46439: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.46441: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.46443: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882180.46446: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882180.46448: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882180.46455: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.46528: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.46545: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 
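The stderr chunks here are OpenSSH debug output captured while `_low_level_execute_command()` runs `/bin/sh -c 'echo ~ && sleep 0'` over the multiplexed connection (note `auto-mux: Trying existing master`, which reuses the ControlPersist socket instead of opening a new TCP session). A rough local-only sketch of what such a low-level wrapper returns as `(rc, stdout, stderr)` — the `run_low_level` name is made up for illustration and omits all of the SSH plumbing:

```python
import subprocess

def run_low_level(cmd):
    """Run a command the way the ssh plugin ultimately does:
    via /bin/sh -c, capturing return code, stdout, and stderr."""
    proc = subprocess.run(
        ["/bin/sh", "-c", cmd],
        capture_output=True,
        text=True,
    )
    return proc.returncode, proc.stdout, proc.stderr

# The same probe the log shows: the shell expands ~ to $HOME
# (the remote side printed "/root" in this transcript).
rc, out, err = run_low_level("echo ~ && sleep 0")
print(rc, out.strip())
```

Ansible uses the home-directory probe to decide where to create its remote temp directory (`~/.ansible/tmp`), which is exactly the `mkdir` command issued in the next `_low_level_execute_command()` call below.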
11579 1726882180.46561: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.46632: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.48208: stdout chunk (state=3): >>>/root <<< 11579 1726882180.48368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.48372: stdout chunk (state=3): >>><<< 11579 1726882180.48375: stderr chunk (state=3): >>><<< 11579 1726882180.48404: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882180.48509: _low_level_execute_command(): starting 11579 1726882180.48514: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122 `" && echo ansible-tmp-1726882180.484122-12046-67508372843122="` echo /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122 `" ) && sleep 0' 11579 1726882180.49185: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.49212: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.49292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.51184: stdout chunk (state=3): >>>ansible-tmp-1726882180.484122-12046-67508372843122=/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122 <<< 11579 1726882180.51368: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.51372: stdout chunk (state=3): >>><<< 11579 1726882180.51374: stderr chunk (state=3): >>><<< 11579 1726882180.51603: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882180.484122-12046-67508372843122=/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882180.51607: variable 'ansible_module_compression' from source: unknown 11579 1726882180.51609: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882180.51671: variable 'ansible_facts' from source: unknown 11579 1726882180.51811: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py 11579 1726882180.52047: Sending initial data 11579 1726882180.52152: Sent initial data (151 bytes) 11579 1726882180.52541: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 
11579 1726882180.52544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882180.52546: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882180.52548: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.52551: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.52589: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.52613: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.52650: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.54899: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension 
"expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 debug2: Sending SSH2_FXP_REALPATH "." debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw3clxbec" to remote "/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py" <<< 11579 1726882180.54903: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw3clxbec /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py <<< 11579 1726882180.55647: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.55711: stderr chunk (state=3): >>><<< 11579 1726882180.55715: stdout chunk (state=3): >>><<< 11579 1726882180.55783: done transferring module to remote 11579 1726882180.55794: _low_level_execute_command(): starting 11579 1726882180.55802: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/ /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py && sleep 0' 11579 1726882180.56478: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.56513: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.56525: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.56533: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.56608: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.58348: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.58360: stderr chunk (state=3): >>><<< 11579 1726882180.58368: stdout chunk (state=3): >>><<< 11579 1726882180.58417: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882180.58433: _low_level_execute_command(): starting 11579 1726882180.58442: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/AnsiballZ_stat.py && sleep 0' 11579 1726882180.59083: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882180.59100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.59115: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.59134: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882180.59175: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882180.59224: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.59336: 
stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.59373: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.59417: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.59470: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.74403: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26353, "dev": 23, "nlink": 1, "atime": 1726882179.1128848, "mtime": 1726882179.1128848, "ctime": 1726882179.1128848, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882180.75709: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882180.75714: stdout chunk (state=3): >>><<< 11579 1726882180.75716: stderr chunk (state=3): >>><<< 11579 1726882180.76200: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test1", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26353, "dev": 23, "nlink": 1, "atime": 1726882179.1128848, "mtime": 1726882179.1128848, "ctime": 1726882179.1128848, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882180.76204: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882180.76212: _low_level_execute_command(): starting 11579 1726882180.76215: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882180.484122-12046-67508372843122/ > /dev/null 2>&1 && sleep 0' 11579 1726882180.76531: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882180.76539: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.76549: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.76601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882180.76605: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 
1726882180.76607: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882180.76610: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.76612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882180.76614: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882180.76617: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882180.76623: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882180.76633: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882180.76644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882180.76657: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882180.76660: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882180.76670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.76747: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.76750: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882180.76839: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.76907: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882180.79431: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882180.79434: stdout chunk (state=3): >>><<< 11579 1726882180.79437: stderr chunk (state=3): >>><<< 11579 1726882180.79439: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882180.79442: handler run complete 11579 1726882180.79444: attempt loop complete, returning result 11579 1726882180.79446: _execute() done 11579 1726882180.79448: dumping result to json 11579 1726882180.79450: done dumping result, returning 11579 1726882180.79452: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test1 [12673a56-9f93-f197-7423-000000000152] 11579 1726882180.79454: sending task result for task 12673a56-9f93-f197-7423-000000000152 ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882179.1128848, "block_size": 4096, "blocks": 0, "ctime": 1726882179.1128848, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26353, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, 
"lnk_source": "/sys/devices/virtual/net/test1", "lnk_target": "../../devices/virtual/net/test1", "mode": "0777", "mtime": 1726882179.1128848, "nlink": 1, "path": "/sys/class/net/test1", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11579 1726882180.80050: no more pending results, returning what we have 11579 1726882180.80054: results queue empty 11579 1726882180.80055: checking for any_errors_fatal 11579 1726882180.80057: done checking for any_errors_fatal 11579 1726882180.80057: checking for max_fail_percentage 11579 1726882180.80059: done checking for max_fail_percentage 11579 1726882180.80060: checking to see if all hosts have failed and the running result is not ok 11579 1726882180.80061: done checking to see if all hosts have failed 11579 1726882180.80062: getting the remaining hosts for this loop 11579 1726882180.80064: done getting the remaining hosts for this loop 11579 1726882180.80067: getting the next task for host managed_node1 11579 1726882180.80077: done getting next task for host managed_node1 11579 1726882180.80080: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11579 1726882180.80083: ^ state is: HOST STATE: block=2, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882180.80088: getting variables 11579 1726882180.80089: in VariableManager get_vars() 11579 1726882180.80134: Calling all_inventory to load vars for managed_node1 11579 1726882180.80137: Calling groups_inventory to load vars for managed_node1 11579 1726882180.80140: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.80151: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.80154: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.80157: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.81417: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.81623: done with get_vars() 11579 1726882180.81634: done getting variables 11579 1726882180.81800: done sending task result for task 12673a56-9f93-f197-7423-000000000152 11579 1726882180.81803: WORKER PROCESS EXITING 11579 1726882180.81877: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) 11579 1726882180.82033: variable 'interface' from source: task vars 11579 1726882180.82037: variable 'dhcp_interface1' from source: play vars 11579 1726882180.82101: variable 'dhcp_interface1' from source: play vars TASK [Assert that the interface is present - 'test1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:40 -0400 (0:00:00.388) 0:00:09.529 ****** 11579 1726882180.82133: entering _queue_task() for managed_node1/assert 11579 1726882180.82134: Creating lock for assert 11579 1726882180.82454: worker is 1 (out of 1 available) 11579 1726882180.82468: 
exiting _queue_task() for managed_node1/assert 11579 1726882180.82482: done queuing things up, now waiting for results queue to drain 11579 1726882180.82483: waiting for pending results... 11579 1726882180.83042: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' 11579 1726882180.83153: in run() - task 12673a56-9f93-f197-7423-000000000017 11579 1726882180.83500: variable 'ansible_search_path' from source: unknown 11579 1726882180.83504: variable 'ansible_search_path' from source: unknown 11579 1726882180.83507: calling self._execute() 11579 1726882180.83536: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.83548: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.83563: variable 'omit' from source: magic vars 11579 1726882180.84499: variable 'ansible_distribution_major_version' from source: facts 11579 1726882180.84502: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882180.84504: variable 'omit' from source: magic vars 11579 1726882180.84506: variable 'omit' from source: magic vars 11579 1726882180.84508: variable 'interface' from source: task vars 11579 1726882180.84509: variable 'dhcp_interface1' from source: play vars 11579 1726882180.84724: variable 'dhcp_interface1' from source: play vars 11579 1726882180.84747: variable 'omit' from source: magic vars 11579 1726882180.84790: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882180.84835: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882180.84860: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882180.85119: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.85299: Loading ShellModule 
'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.85303: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882180.85305: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.85307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.85309: Set connection var ansible_timeout to 10 11579 1726882180.85311: Set connection var ansible_shell_type to sh 11579 1726882180.85313: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882180.85315: Set connection var ansible_shell_executable to /bin/sh 11579 1726882180.85326: Set connection var ansible_pipelining to False 11579 1726882180.85333: Set connection var ansible_connection to ssh 11579 1726882180.85359: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.85800: variable 'ansible_connection' from source: unknown 11579 1726882180.85803: variable 'ansible_module_compression' from source: unknown 11579 1726882180.85805: variable 'ansible_shell_type' from source: unknown 11579 1726882180.85808: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.85810: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.85812: variable 'ansible_pipelining' from source: unknown 11579 1726882180.85814: variable 'ansible_timeout' from source: unknown 11579 1726882180.85816: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.85819: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882180.85821: variable 'omit' from source: magic vars 11579 1726882180.85823: 
starting attempt loop 11579 1726882180.85825: running the handler 11579 1726882180.85947: variable 'interface_stat' from source: set_fact 11579 1726882180.86221: Evaluated conditional (interface_stat.stat.exists): True 11579 1726882180.86234: handler run complete 11579 1726882180.86253: attempt loop complete, returning result 11579 1726882180.86260: _execute() done 11579 1726882180.86270: dumping result to json 11579 1726882180.86278: done dumping result, returning 11579 1726882180.86296: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test1' [12673a56-9f93-f197-7423-000000000017] 11579 1726882180.86307: sending task result for task 12673a56-9f93-f197-7423-000000000017 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882180.86451: no more pending results, returning what we have 11579 1726882180.86454: results queue empty 11579 1726882180.86455: checking for any_errors_fatal 11579 1726882180.86464: done checking for any_errors_fatal 11579 1726882180.86464: checking for max_fail_percentage 11579 1726882180.86466: done checking for max_fail_percentage 11579 1726882180.86467: checking to see if all hosts have failed and the running result is not ok 11579 1726882180.86468: done checking to see if all hosts have failed 11579 1726882180.86468: getting the remaining hosts for this loop 11579 1726882180.86470: done getting the remaining hosts for this loop 11579 1726882180.86472: getting the next task for host managed_node1 11579 1726882180.86480: done getting next task for host managed_node1 11579 1726882180.86482: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11579 1726882180.86697: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882180.86702: getting variables 11579 1726882180.86704: in VariableManager get_vars() 11579 1726882180.86746: Calling all_inventory to load vars for managed_node1 11579 1726882180.86749: Calling groups_inventory to load vars for managed_node1 11579 1726882180.86752: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.86764: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.86767: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.86770: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.86974: done sending task result for task 12673a56-9f93-f197-7423-000000000017 11579 1726882180.86978: WORKER PROCESS EXITING 11579 1726882180.87114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.87803: done with get_vars() 11579 1726882180.87816: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.057) 0:00:09.587 ****** 11579 1726882180.87912: entering _queue_task() for managed_node1/include_tasks 11579 1726882180.88761: worker is 1 (out of 1 available) 11579 1726882180.88913: exiting _queue_task() for managed_node1/include_tasks 11579 1726882180.88924: done queuing things up, now waiting for results queue to drain 11579 1726882180.88925: waiting for pending results... 
11579 1726882180.89038: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11579 1726882180.89157: in run() - task 12673a56-9f93-f197-7423-00000000001b 11579 1726882180.89176: variable 'ansible_search_path' from source: unknown 11579 1726882180.89185: variable 'ansible_search_path' from source: unknown 11579 1726882180.89234: calling self._execute() 11579 1726882180.89322: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.89339: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.89355: variable 'omit' from source: magic vars 11579 1726882180.89771: variable 'ansible_distribution_major_version' from source: facts 11579 1726882180.89787: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882180.90345: _execute() done 11579 1726882180.90348: dumping result to json 11579 1726882180.90351: done dumping result, returning 11579 1726882180.90353: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-f197-7423-00000000001b] 11579 1726882180.90355: sending task result for task 12673a56-9f93-f197-7423-00000000001b 11579 1726882180.90451: no more pending results, returning what we have 11579 1726882180.90457: in VariableManager get_vars() 11579 1726882180.90505: Calling all_inventory to load vars for managed_node1 11579 1726882180.90507: Calling groups_inventory to load vars for managed_node1 11579 1726882180.90510: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.90524: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.90527: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.90531: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.91051: done sending task result for task 12673a56-9f93-f197-7423-00000000001b 11579 1726882180.91055: WORKER PROCESS EXITING 11579 
1726882180.91068: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.91265: done with get_vars() 11579 1726882180.91273: variable 'ansible_search_path' from source: unknown 11579 1726882180.91274: variable 'ansible_search_path' from source: unknown 11579 1726882180.91617: we have included files to process 11579 1726882180.91618: generating all_blocks data 11579 1726882180.91621: done generating all_blocks data 11579 1726882180.91625: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.91626: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.91629: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882180.91945: done processing included file 11579 1726882180.91947: iterating over new_blocks loaded from include file 11579 1726882180.91949: in VariableManager get_vars() 11579 1726882180.91968: done with get_vars() 11579 1726882180.91969: filtering new block on tags 11579 1726882180.91984: done filtering new block on tags 11579 1726882180.91986: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11579 1726882180.91991: extending task lists for all hosts with included blocks 11579 1726882180.92091: done extending task lists 11579 1726882180.92373: done processing included files 11579 1726882180.92374: results queue empty 11579 1726882180.92375: checking for any_errors_fatal 11579 1726882180.92378: done checking for any_errors_fatal 11579 1726882180.92379: checking for max_fail_percentage 11579 1726882180.92380: done checking for 
max_fail_percentage 11579 1726882180.92380: checking to see if all hosts have failed and the running result is not ok 11579 1726882180.92381: done checking to see if all hosts have failed 11579 1726882180.92382: getting the remaining hosts for this loop 11579 1726882180.92383: done getting the remaining hosts for this loop 11579 1726882180.92386: getting the next task for host managed_node1 11579 1726882180.92391: done getting next task for host managed_node1 11579 1726882180.92396: ^ task is: TASK: Get stat for interface {{ interface }} 11579 1726882180.92399: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882180.92401: getting variables 11579 1726882180.92402: in VariableManager get_vars() 11579 1726882180.92415: Calling all_inventory to load vars for managed_node1 11579 1726882180.92417: Calling groups_inventory to load vars for managed_node1 11579 1726882180.92420: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882180.92425: Calling all_plugins_play to load vars for managed_node1 11579 1726882180.92427: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882180.92430: Calling groups_plugins_play to load vars for managed_node1 11579 1726882180.92570: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882180.93636: done with get_vars() 11579 1726882180.93645: done getting variables 11579 1726882180.93787: variable 'interface' from source: task vars 11579 1726882180.93791: variable 'dhcp_interface2' from source: play vars 11579 1726882180.93852: variable 'dhcp_interface2' from source: play vars TASK [Get stat for interface test2] ******************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:40 -0400 (0:00:00.059) 0:00:09.647 ****** 11579 1726882180.93880: entering _queue_task() for managed_node1/stat 11579 1726882180.94153: worker is 1 (out of 1 available) 11579 1726882180.94164: exiting _queue_task() for managed_node1/stat 11579 1726882180.94174: done queuing things up, now waiting for results queue to drain 11579 1726882180.94175: waiting for pending results... 
11579 1726882180.94427: running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 11579 1726882180.94549: in run() - task 12673a56-9f93-f197-7423-00000000016a 11579 1726882180.94566: variable 'ansible_search_path' from source: unknown 11579 1726882180.94572: variable 'ansible_search_path' from source: unknown 11579 1726882180.94615: calling self._execute() 11579 1726882180.94702: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.94713: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.94727: variable 'omit' from source: magic vars 11579 1726882180.95500: variable 'ansible_distribution_major_version' from source: facts 11579 1726882180.95504: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882180.95506: variable 'omit' from source: magic vars 11579 1726882180.95743: variable 'omit' from source: magic vars 11579 1726882180.95746: variable 'interface' from source: task vars 11579 1726882180.95749: variable 'dhcp_interface2' from source: play vars 11579 1726882180.95887: variable 'dhcp_interface2' from source: play vars 11579 1726882180.95980: variable 'omit' from source: magic vars 11579 1726882180.96030: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882180.96111: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882180.96202: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882180.96226: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.96244: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882180.96321: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882180.96512: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.96515: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.96626: Set connection var ansible_timeout to 10 11579 1726882180.96638: Set connection var ansible_shell_type to sh 11579 1726882180.96651: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882180.96661: Set connection var ansible_shell_executable to /bin/sh 11579 1726882180.96673: Set connection var ansible_pipelining to False 11579 1726882180.96680: Set connection var ansible_connection to ssh 11579 1726882180.96710: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.96733: variable 'ansible_connection' from source: unknown 11579 1726882180.96836: variable 'ansible_module_compression' from source: unknown 11579 1726882180.96839: variable 'ansible_shell_type' from source: unknown 11579 1726882180.96841: variable 'ansible_shell_executable' from source: unknown 11579 1726882180.96843: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882180.96845: variable 'ansible_pipelining' from source: unknown 11579 1726882180.96848: variable 'ansible_timeout' from source: unknown 11579 1726882180.96850: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882180.97271: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882180.97284: variable 'omit' from source: magic vars 11579 1726882180.97299: starting attempt loop 11579 1726882180.97307: running the handler 11579 1726882180.97326: _low_level_execute_command(): starting 11579 1726882180.97338: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 
1726882180.98783: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882180.98810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882180.99035: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882180.99078: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.00689: stdout chunk (state=3): >>>/root <<< 11579 1726882181.00959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.00964: stdout chunk (state=3): >>><<< 11579 1726882181.00977: stderr chunk (state=3): >>><<< 11579 1726882181.00998: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: 
match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.01012: _low_level_execute_command(): starting 11579 1726882181.01020: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055 `" && echo ansible-tmp-1726882181.0099766-12073-276300397080055="` echo /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055 `" ) && sleep 0' 11579 1726882181.02401: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.02531: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.02571: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.04429: stdout chunk (state=3): >>>ansible-tmp-1726882181.0099766-12073-276300397080055=/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055 <<< 11579 1726882181.04571: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.04575: stdout chunk (state=3): >>><<< 11579 1726882181.04582: stderr chunk (state=3): >>><<< 11579 1726882181.04607: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.0099766-12073-276300397080055=/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.04710: variable 'ansible_module_compression' from source: unknown 11579 1726882181.04810: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882181.04848: variable 'ansible_facts' from source: unknown 11579 1726882181.05145: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py 11579 1726882181.05567: Sending initial data 11579 1726882181.05570: Sent initial data (153 bytes) 11579 1726882181.06098: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882181.06112: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882181.06236: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.06240: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882181.06242: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.06281: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.07809: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11579 1726882181.08067: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11579 1726882181.08111: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882181.08140: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882181.08400: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp1aosp950 /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py <<< 11579 1726882181.08403: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py" <<< 11579 1726882181.08428: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp1aosp950" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py" <<< 11579 1726882181.09437: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.09485: stderr chunk (state=3): >>><<< 11579 1726882181.09498: stdout chunk (state=3): >>><<< 11579 1726882181.09525: done transferring module to remote 11579 1726882181.09539: _low_level_execute_command(): starting 11579 1726882181.09548: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/ /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py && sleep 0' 11579 1726882181.10127: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882181.10139: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882181.10152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.10168: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882181.10207: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882181.10221: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.10290: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.10322: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882181.10335: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.10406: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.12314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.12319: stdout chunk (state=3): >>><<< 11579 1726882181.12505: stderr chunk (state=3): >>><<< 11579 1726882181.12509: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.12512: _low_level_execute_command(): starting 11579 1726882181.12514: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/AnsiballZ_stat.py && sleep 0' 11579 1726882181.13843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.14007: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.14031: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 <<< 11579 1726882181.14126: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.29201: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26759, "dev": 23, "nlink": 1, "atime": 1726882179.1161113, "mtime": 1726882179.1161113, "ctime": 1726882179.1161113, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882181.30469: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882181.30482: stderr chunk (state=3): >>><<< 11579 1726882181.30533: stdout chunk (state=3): >>><<< 11579 1726882181.30537: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/test2", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 26759, "dev": 23, "nlink": 1, "atime": 1726882179.1161113, "mtime": 1726882179.1161113, "ctime": 1726882179.1161113, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/test2", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882181.30606: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/test2', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882181.30610: _low_level_execute_command(): starting 11579 1726882181.30612: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.0099766-12073-276300397080055/ > /dev/null 2>&1 && sleep 0' 11579 1726882181.31022: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.31026: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882181.31029: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass <<< 11579 1726882181.31037: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.31039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.31079: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.31088: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.31129: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.32929: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.32950: stderr chunk (state=3): >>><<< 11579 1726882181.32954: stdout chunk (state=3): >>><<< 11579 1726882181.32967: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.32974: handler run complete 11579 1726882181.33010: attempt loop complete, returning result 11579 1726882181.33013: _execute() done 11579 1726882181.33016: dumping result to json 11579 1726882181.33021: done dumping result, returning 11579 1726882181.33028: done running TaskExecutor() for managed_node1/TASK: Get stat for interface test2 [12673a56-9f93-f197-7423-00000000016a] 11579 1726882181.33032: sending task result for task 12673a56-9f93-f197-7423-00000000016a 11579 1726882181.33134: done sending task result for task 12673a56-9f93-f197-7423-00000000016a 11579 1726882181.33137: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882179.1161113, "block_size": 4096, "blocks": 0, "ctime": 1726882179.1161113, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 26759, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/test2", "lnk_target": "../../devices/virtual/net/test2", "mode": "0777", "mtime": 1726882179.1161113, "nlink": 1, "path": "/sys/class/net/test2", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11579 1726882181.33222: no more pending results, returning what we have 11579 1726882181.33225: results 
queue empty 11579 1726882181.33225: checking for any_errors_fatal 11579 1726882181.33227: done checking for any_errors_fatal 11579 1726882181.33227: checking for max_fail_percentage 11579 1726882181.33229: done checking for max_fail_percentage 11579 1726882181.33229: checking to see if all hosts have failed and the running result is not ok 11579 1726882181.33231: done checking to see if all hosts have failed 11579 1726882181.33231: getting the remaining hosts for this loop 11579 1726882181.33233: done getting the remaining hosts for this loop 11579 1726882181.33236: getting the next task for host managed_node1 11579 1726882181.33243: done getting next task for host managed_node1 11579 1726882181.33247: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11579 1726882181.33250: ^ state is: HOST STATE: block=2, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882181.33254: getting variables 11579 1726882181.33255: in VariableManager get_vars() 11579 1726882181.33291: Calling all_inventory to load vars for managed_node1 11579 1726882181.33301: Calling groups_inventory to load vars for managed_node1 11579 1726882181.33304: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882181.33313: Calling all_plugins_play to load vars for managed_node1 11579 1726882181.33315: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882181.33318: Calling groups_plugins_play to load vars for managed_node1 11579 1726882181.33450: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882181.33576: done with get_vars() 11579 1726882181.33584: done getting variables 11579 1726882181.33632: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882181.33717: variable 'interface' from source: task vars 11579 1726882181.33721: variable 'dhcp_interface2' from source: play vars 11579 1726882181.33763: variable 'dhcp_interface2' from source: play vars TASK [Assert that the interface is present - 'test2'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:41 -0400 (0:00:00.399) 0:00:10.046 ****** 11579 1726882181.33785: entering _queue_task() for managed_node1/assert 11579 1726882181.33970: worker is 1 (out of 1 available) 11579 1726882181.33982: exiting _queue_task() for managed_node1/assert 11579 1726882181.33994: done queuing things up, now waiting for results queue to drain 11579 1726882181.33996: waiting for pending results... 
11579 1726882181.34145: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' 11579 1726882181.34212: in run() - task 12673a56-9f93-f197-7423-00000000001c 11579 1726882181.34224: variable 'ansible_search_path' from source: unknown 11579 1726882181.34230: variable 'ansible_search_path' from source: unknown 11579 1726882181.34261: calling self._execute() 11579 1726882181.34326: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.34338: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.34342: variable 'omit' from source: magic vars 11579 1726882181.34644: variable 'ansible_distribution_major_version' from source: facts 11579 1726882181.34653: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882181.34666: variable 'omit' from source: magic vars 11579 1726882181.34691: variable 'omit' from source: magic vars 11579 1726882181.34756: variable 'interface' from source: task vars 11579 1726882181.34760: variable 'dhcp_interface2' from source: play vars 11579 1726882181.34811: variable 'dhcp_interface2' from source: play vars 11579 1726882181.34825: variable 'omit' from source: magic vars 11579 1726882181.34856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882181.34884: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882181.34901: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882181.34914: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882181.34924: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882181.34946: variable 'inventory_hostname' from source: host 
vars for 'managed_node1' 11579 1726882181.34949: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.34952: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.35030: Set connection var ansible_timeout to 10 11579 1726882181.35033: Set connection var ansible_shell_type to sh 11579 1726882181.35041: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882181.35045: Set connection var ansible_shell_executable to /bin/sh 11579 1726882181.35051: Set connection var ansible_pipelining to False 11579 1726882181.35054: Set connection var ansible_connection to ssh 11579 1726882181.35069: variable 'ansible_shell_executable' from source: unknown 11579 1726882181.35071: variable 'ansible_connection' from source: unknown 11579 1726882181.35074: variable 'ansible_module_compression' from source: unknown 11579 1726882181.35076: variable 'ansible_shell_type' from source: unknown 11579 1726882181.35078: variable 'ansible_shell_executable' from source: unknown 11579 1726882181.35080: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.35085: variable 'ansible_pipelining' from source: unknown 11579 1726882181.35088: variable 'ansible_timeout' from source: unknown 11579 1726882181.35103: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.35184: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882181.35191: variable 'omit' from source: magic vars 11579 1726882181.35197: starting attempt loop 11579 1726882181.35200: running the handler 11579 1726882181.35287: variable 'interface_stat' from source: set_fact 11579 1726882181.35303: Evaluated conditional 
(interface_stat.stat.exists): True 11579 1726882181.35308: handler run complete 11579 1726882181.35322: attempt loop complete, returning result 11579 1726882181.35325: _execute() done 11579 1726882181.35328: dumping result to json 11579 1726882181.35330: done dumping result, returning 11579 1726882181.35335: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'test2' [12673a56-9f93-f197-7423-00000000001c] 11579 1726882181.35341: sending task result for task 12673a56-9f93-f197-7423-00000000001c 11579 1726882181.35416: done sending task result for task 12673a56-9f93-f197-7423-00000000001c 11579 1726882181.35419: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882181.35478: no more pending results, returning what we have 11579 1726882181.35481: results queue empty 11579 1726882181.35482: checking for any_errors_fatal 11579 1726882181.35487: done checking for any_errors_fatal 11579 1726882181.35488: checking for max_fail_percentage 11579 1726882181.35489: done checking for max_fail_percentage 11579 1726882181.35490: checking to see if all hosts have failed and the running result is not ok 11579 1726882181.35491: done checking to see if all hosts have failed 11579 1726882181.35491: getting the remaining hosts for this loop 11579 1726882181.35497: done getting the remaining hosts for this loop 11579 1726882181.35500: getting the next task for host managed_node1 11579 1726882181.35506: done getting next task for host managed_node1 11579 1726882181.35508: ^ task is: TASK: Backup the /etc/resolv.conf for initscript 11579 1726882181.35510: ^ state is: HOST STATE: block=2, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882181.35513: getting variables 11579 1726882181.35514: in VariableManager get_vars() 11579 1726882181.35546: Calling all_inventory to load vars for managed_node1 11579 1726882181.35549: Calling groups_inventory to load vars for managed_node1 11579 1726882181.35551: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882181.35559: Calling all_plugins_play to load vars for managed_node1 11579 1726882181.35561: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882181.35564: Calling groups_plugins_play to load vars for managed_node1 11579 1726882181.35707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882181.35819: done with get_vars() 11579 1726882181.35826: done getting variables 11579 1726882181.35861: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Backup the /etc/resolv.conf for initscript] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:28 Friday 20 September 2024 21:29:41 -0400 (0:00:00.020) 0:00:10.067 ****** 11579 1726882181.35879: entering _queue_task() for managed_node1/command 11579 1726882181.36051: worker is 1 (out of 1 available) 11579 1726882181.36064: exiting _queue_task() for managed_node1/command 11579 1726882181.36075: done queuing things up, now waiting for results queue to drain 11579 1726882181.36076: waiting for pending results... 
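The assert task above reduces to "Evaluated conditional (interface_stat.stat.exists): True" followed by "All assertions passed". A toy model of that flow — conditions checked in order, first failure wins — can be sketched as follows; real Ansible evaluates Jinja2 expressions against task vars, whereas this sketch uses plain callables for clarity:

```python
def run_assert(that, variables):
    # Toy model of the assert action's result shapes seen in the log.
    # 'that' is a list of callables standing in for Jinja2 conditionals.
    for cond in that:
        if not cond(variables):
            return {"failed": True, "msg": "Assertion failed"}
    return {"changed": False, "msg": "All assertions passed"}
```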
11579 1726882181.36215: running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript 11579 1726882181.36262: in run() - task 12673a56-9f93-f197-7423-00000000001d 11579 1726882181.36273: variable 'ansible_search_path' from source: unknown 11579 1726882181.36302: calling self._execute() 11579 1726882181.36600: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.36604: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.36606: variable 'omit' from source: magic vars 11579 1726882181.36764: variable 'ansible_distribution_major_version' from source: facts 11579 1726882181.36783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882181.36911: variable 'network_provider' from source: set_fact 11579 1726882181.36924: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882181.36933: when evaluation is False, skipping this task 11579 1726882181.36953: _execute() done 11579 1726882181.36961: dumping result to json 11579 1726882181.36969: done dumping result, returning 11579 1726882181.36981: done running TaskExecutor() for managed_node1/TASK: Backup the /etc/resolv.conf for initscript [12673a56-9f93-f197-7423-00000000001d] 11579 1726882181.37057: sending task result for task 12673a56-9f93-f197-7423-00000000001d 11579 1726882181.37129: done sending task result for task 12673a56-9f93-f197-7423-00000000001d 11579 1726882181.37132: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11579 1726882181.37212: no more pending results, returning what we have 11579 1726882181.37216: results queue empty 11579 1726882181.37217: checking for any_errors_fatal 11579 1726882181.37224: done checking for any_errors_fatal 11579 1726882181.37225: checking for max_fail_percentage 11579 1726882181.37227: done checking 
for max_fail_percentage 11579 1726882181.37227: checking to see if all hosts have failed and the running result is not ok 11579 1726882181.37229: done checking to see if all hosts have failed 11579 1726882181.37229: getting the remaining hosts for this loop 11579 1726882181.37231: done getting the remaining hosts for this loop 11579 1726882181.37234: getting the next task for host managed_node1 11579 1726882181.37240: done getting next task for host managed_node1 11579 1726882181.37243: ^ task is: TASK: TEST Add Bond with 2 ports 11579 1726882181.37245: ^ state is: HOST STATE: block=2, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882181.37248: getting variables 11579 1726882181.37250: in VariableManager get_vars() 11579 1726882181.37306: Calling all_inventory to load vars for managed_node1 11579 1726882181.37309: Calling groups_inventory to load vars for managed_node1 11579 1726882181.37312: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882181.37328: Calling all_plugins_play to load vars for managed_node1 11579 1726882181.37331: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882181.37334: Calling groups_plugins_play to load vars for managed_node1 11579 1726882181.37682: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882181.37898: done with get_vars() 11579 1726882181.37909: done getting variables 11579 1726882181.37966: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK 
[TEST Add Bond with 2 ports] ********************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:33 Friday 20 September 2024 21:29:41 -0400 (0:00:00.021) 0:00:10.088 ****** 11579 1726882181.37996: entering _queue_task() for managed_node1/debug 11579 1726882181.38225: worker is 1 (out of 1 available) 11579 1726882181.38237: exiting _queue_task() for managed_node1/debug 11579 1726882181.38246: done queuing things up, now waiting for results queue to drain 11579 1726882181.38248: waiting for pending results... 11579 1726882181.38605: running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports 11579 1726882181.38613: in run() - task 12673a56-9f93-f197-7423-00000000001e 11579 1726882181.38634: variable 'ansible_search_path' from source: unknown 11579 1726882181.38675: calling self._execute() 11579 1726882181.38770: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.38784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.38901: variable 'omit' from source: magic vars 11579 1726882181.39271: variable 'ansible_distribution_major_version' from source: facts 11579 1726882181.39288: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882181.39304: variable 'omit' from source: magic vars 11579 1726882181.39328: variable 'omit' from source: magic vars 11579 1726882181.39377: variable 'omit' from source: magic vars 11579 1726882181.39420: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882181.39468: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882181.39496: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882181.39520: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882181.39536: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882181.39578: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882181.39587: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.39601: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.39715: Set connection var ansible_timeout to 10 11579 1726882181.39729: Set connection var ansible_shell_type to sh 11579 1726882181.39790: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882181.39797: Set connection var ansible_shell_executable to /bin/sh 11579 1726882181.39800: Set connection var ansible_pipelining to False 11579 1726882181.39803: Set connection var ansible_connection to ssh 11579 1726882181.39805: variable 'ansible_shell_executable' from source: unknown 11579 1726882181.39807: variable 'ansible_connection' from source: unknown 11579 1726882181.39810: variable 'ansible_module_compression' from source: unknown 11579 1726882181.39820: variable 'ansible_shell_type' from source: unknown 11579 1726882181.39828: variable 'ansible_shell_executable' from source: unknown 11579 1726882181.39836: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882181.39844: variable 'ansible_pipelining' from source: unknown 11579 1726882181.39904: variable 'ansible_timeout' from source: unknown 11579 1726882181.39908: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882181.40027: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882181.40046: variable 'omit' from source: magic vars 11579 1726882181.40057: starting attempt loop 11579 1726882181.40065: running the handler 11579 1726882181.40132: handler run complete 11579 1726882181.40200: attempt loop complete, returning result 11579 1726882181.40203: _execute() done 11579 1726882181.40206: dumping result to json 11579 1726882181.40209: done dumping result, returning 11579 1726882181.40211: done running TaskExecutor() for managed_node1/TASK: TEST Add Bond with 2 ports [12673a56-9f93-f197-7423-00000000001e] 11579 1726882181.40219: sending task result for task 12673a56-9f93-f197-7423-00000000001e ok: [managed_node1] => {} MSG: ################################################## 11579 1726882181.40375: no more pending results, returning what we have 11579 1726882181.40379: results queue empty 11579 1726882181.40380: checking for any_errors_fatal 11579 1726882181.40385: done checking for any_errors_fatal 11579 1726882181.40385: checking for max_fail_percentage 11579 1726882181.40387: done checking for max_fail_percentage 11579 1726882181.40388: checking to see if all hosts have failed and the running result is not ok 11579 1726882181.40389: done checking to see if all hosts have failed 11579 1726882181.40390: getting the remaining hosts for this loop 11579 1726882181.40391: done getting the remaining hosts for this loop 11579 1726882181.40399: getting the next task for host managed_node1 11579 1726882181.40407: done getting next task for host managed_node1 11579 1726882181.40412: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11579 1726882181.40415: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks 
child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882181.40432: getting variables 11579 1726882181.40434: in VariableManager get_vars() 11579 1726882181.40472: Calling all_inventory to load vars for managed_node1 11579 1726882181.40476: Calling groups_inventory to load vars for managed_node1 11579 1726882181.40478: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882181.40488: Calling all_plugins_play to load vars for managed_node1 11579 1726882181.40491: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882181.40701: Calling groups_plugins_play to load vars for managed_node1 11579 1726882181.40901: done sending task result for task 12673a56-9f93-f197-7423-00000000001e 11579 1726882181.40904: WORKER PROCESS EXITING 11579 1726882181.40914: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882181.41035: done with get_vars() 11579 1726882181.41044: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:29:41 -0400 (0:00:00.031) 0:00:10.119 ****** 11579 1726882181.41107: entering _queue_task() for managed_node1/include_tasks 11579 1726882181.41284: worker is 1 (out of 1 available) 11579 1726882181.41301: exiting _queue_task() for managed_node1/include_tasks 11579 1726882181.41310: done queuing things up, now waiting for results queue to drain 11579 1726882181.41311: waiting for pending results... 
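The "Backup the /etc/resolv.conf for initscript" task above was skipped because `network_provider == "initscripts"` evaluated to False: the `when` conditional is checked before the module ever runs, and a False result short-circuits into a skip result. A simplified model of that control flow (again with callables standing in for Jinja2 conditionals, and invented dict keys for the task shape):

```python
def run_when(task, variables):
    # Evaluate the 'when' conditional first; if it is False, return a
    # skip result mirroring the fields the log prints, without invoking
    # the action at all.
    cond = task.get("when")
    if cond is not None and not cond(variables):
        return {"changed": False,
                "false_condition": task.get("when_expr", ""),
                "skip_reason": "Conditional result was False"}
    return task["action"](variables)
```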
11579 1726882181.41457: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role
11579 1726882181.41528: in run() - task 12673a56-9f93-f197-7423-000000000026
11579 1726882181.41540: variable 'ansible_search_path' from source: unknown
11579 1726882181.41546: variable 'ansible_search_path' from source: unknown
11579 1726882181.41573: calling self._execute()
11579 1726882181.41634: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.41637: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.41651: variable 'omit' from source: magic vars
11579 1726882181.41895: variable 'ansible_distribution_major_version' from source: facts
11579 1726882181.41906: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882181.41911: _execute() done
11579 1726882181.41914: dumping result to json
11579 1726882181.41917: done dumping result, returning
11579 1726882181.41924: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-f197-7423-000000000026]
11579 1726882181.41928: sending task result for task 12673a56-9f93-f197-7423-000000000026
11579 1726882181.42047: no more pending results, returning what we have
11579 1726882181.42051: in VariableManager get_vars()
11579 1726882181.42088: Calling all_inventory to load vars for managed_node1
11579 1726882181.42090: Calling groups_inventory to load vars for managed_node1
11579 1726882181.42092: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882181.42103: Calling all_plugins_play to load vars for managed_node1
11579 1726882181.42106: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882181.42109: Calling groups_plugins_play to load vars for managed_node1
11579 1726882181.42217: done sending task result for task 12673a56-9f93-f197-7423-000000000026
11579 1726882181.42220: WORKER PROCESS EXITING
11579 1726882181.42230: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882181.42372: done with get_vars()
11579 1726882181.42380: variable 'ansible_search_path' from source: unknown
11579 1726882181.42381: variable 'ansible_search_path' from source: unknown
11579 1726882181.42420: we have included files to process
11579 1726882181.42422: generating all_blocks data
11579 1726882181.42423: done generating all_blocks data
11579 1726882181.42426: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11579 1726882181.42427: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11579 1726882181.42430: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml
11579 1726882181.43091: done processing included file
11579 1726882181.43095: iterating over new_blocks loaded from include file
11579 1726882181.43096: in VariableManager get_vars()
11579 1726882181.43119: done with get_vars()
11579 1726882181.43121: filtering new block on tags
11579 1726882181.43137: done filtering new block on tags
11579 1726882181.43139: in VariableManager get_vars()
11579 1726882181.43161: done with get_vars()
11579 1726882181.43171: filtering new block on tags
11579 1726882181.43213: done filtering new block on tags
11579 1726882181.43216: in VariableManager get_vars()
11579 1726882181.43252: done with get_vars()
11579 1726882181.43254: filtering new block on tags
11579 1726882181.43278: done filtering new block on tags
11579 1726882181.43280: done iterating over new_blocks loaded from include file
included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1
11579 1726882181.43286: extending task lists for all hosts with included blocks
11579 1726882181.43845: done extending task lists
11579 1726882181.43846: done processing included files
11579 1726882181.43847: results queue empty
11579 1726882181.43847: checking for any_errors_fatal
11579 1726882181.43849: done checking for any_errors_fatal
11579 1726882181.43850: checking for max_fail_percentage
11579 1726882181.43850: done checking for max_fail_percentage
11579 1726882181.43851: checking to see if all hosts have failed and the running result is not ok
11579 1726882181.43851: done checking to see if all hosts have failed
11579 1726882181.43852: getting the remaining hosts for this loop
11579 1726882181.43853: done getting the remaining hosts for this loop
11579 1726882181.43854: getting the next task for host managed_node1
11579 1726882181.43857: done getting next task for host managed_node1
11579 1726882181.43858: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
11579 1726882181.43860: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882181.43866: getting variables
11579 1726882181.43866: in VariableManager get_vars()
11579 1726882181.43876: Calling all_inventory to load vars for managed_node1
11579 1726882181.43877: Calling groups_inventory to load vars for managed_node1
11579 1726882181.43878: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882181.43881: Calling all_plugins_play to load vars for managed_node1
11579 1726882181.43883: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882181.43884: Calling groups_plugins_play to load vars for managed_node1
11579 1726882181.43980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882181.44091: done with get_vars()
11579 1726882181.44099: done getting variables

TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3
Friday 20 September 2024 21:29:41 -0400 (0:00:00.030) 0:00:10.149 ******
11579 1726882181.44146: entering _queue_task() for managed_node1/setup
11579 1726882181.44324: worker is 1 (out of 1 available)
11579 1726882181.44335: exiting _queue_task() for managed_node1/setup
11579 1726882181.44344: done queuing things up, now waiting for results queue to drain
11579 1726882181.44346: waiting for pending results...
11579 1726882181.44494: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present
11579 1726882181.44579: in run() - task 12673a56-9f93-f197-7423-000000000188
11579 1726882181.44591: variable 'ansible_search_path' from source: unknown
11579 1726882181.44596: variable 'ansible_search_path' from source: unknown
11579 1726882181.44624: calling self._execute()
11579 1726882181.44677: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.44681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.44692: variable 'omit' from source: magic vars
11579 1726882181.44934: variable 'ansible_distribution_major_version' from source: facts
11579 1726882181.44943: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882181.45077: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11579 1726882181.47100: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11579 1726882181.47104: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11579 1726882181.47109: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11579 1726882181.47150: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11579 1726882181.47200: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11579 1726882181.47285: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882181.47323: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882181.47352: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882181.47399: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882181.47419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882181.47474: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882181.47511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882181.47540: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882181.47582: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882181.47604: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882181.47753: variable '__network_required_facts' from source: role '' defaults
11579 1726882181.47766: variable 'ansible_facts' from source: unknown
11579 1726882181.47858: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False
11579 1726882181.47861: when evaluation is False, skipping this task
11579 1726882181.47864: _execute() done
11579 1726882181.47866: dumping result to json
11579 1726882181.47869: done dumping result, returning
11579 1726882181.47876: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-f197-7423-000000000188]
11579 1726882181.47885: sending task result for task 12673a56-9f93-f197-7423-000000000188
skipping: [managed_node1] => {
    "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result",
    "changed": false
}
11579 1726882181.48022: no more pending results, returning what we have
11579 1726882181.48025: results queue empty
11579 1726882181.48026: checking for any_errors_fatal
11579 1726882181.48027: done checking for any_errors_fatal
11579 1726882181.48028: checking for max_fail_percentage
11579 1726882181.48029: done checking for max_fail_percentage
11579 1726882181.48030: checking to see if all hosts have failed and the running result is not ok
11579 1726882181.48031: done checking to see if all hosts have failed
11579 1726882181.48031: getting the remaining hosts for this loop
11579 1726882181.48033: done getting the remaining hosts for this loop
11579 1726882181.48036: getting the next task for host managed_node1
11579 1726882181.48044: done getting next task for host managed_node1
11579 1726882181.48047: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree
11579 1726882181.48050: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882181.48068: getting variables
11579 1726882181.48070: in VariableManager get_vars()
11579 1726882181.48111: Calling all_inventory to load vars for managed_node1
11579 1726882181.48113: Calling groups_inventory to load vars for managed_node1
11579 1726882181.48116: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882181.48126: Calling all_plugins_play to load vars for managed_node1
11579 1726882181.48129: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882181.48132: Calling groups_plugins_play to load vars for managed_node1
11579 1726882181.48371: done sending task result for task 12673a56-9f93-f197-7423-000000000188
11579 1726882181.48374: WORKER PROCESS EXITING
11579 1726882181.48398: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882181.48639: done with get_vars()
11579 1726882181.48649: done getting variables

TASK [fedora.linux_system_roles.network : Check if system is ostree] ***********
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12
Friday 20 September 2024 21:29:41 -0400 (0:00:00.045) 0:00:10.195 ******
11579 1726882181.48745: entering _queue_task() for managed_node1/stat
11579 1726882181.48973: worker is 1 (out of 1 available)
11579 1726882181.48986: exiting _queue_task() for managed_node1/stat
11579 1726882181.49000: done queuing things up, now waiting for results queue to drain
11579 1726882181.49001: waiting for pending results...
11579 1726882181.49412: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree
11579 1726882181.49417: in run() - task 12673a56-9f93-f197-7423-00000000018a
11579 1726882181.49420: variable 'ansible_search_path' from source: unknown
11579 1726882181.49422: variable 'ansible_search_path' from source: unknown
11579 1726882181.49454: calling self._execute()
11579 1726882181.49535: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.49549: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.49564: variable 'omit' from source: magic vars
11579 1726882181.49904: variable 'ansible_distribution_major_version' from source: facts
11579 1726882181.49978: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882181.50104: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11579 1726882181.50379: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11579 1726882181.50438: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11579 1726882181.50477: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11579 1726882181.50522: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11579 1726882181.50610: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11579 1726882181.50646: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11579 1726882181.50678: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882181.50736: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11579 1726882181.50807: variable '__network_is_ostree' from source: set_fact
11579 1726882181.50821: Evaluated conditional (not __network_is_ostree is defined): False
11579 1726882181.50844: when evaluation is False, skipping this task
11579 1726882181.50847: _execute() done
11579 1726882181.50850: dumping result to json
11579 1726882181.50852: done dumping result, returning
11579 1726882181.50900: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-f197-7423-00000000018a]
11579 1726882181.50903: sending task result for task 12673a56-9f93-f197-7423-00000000018a
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
11579 1726882181.51261: no more pending results, returning what we have
11579 1726882181.51264: results queue empty
11579 1726882181.51265: checking for any_errors_fatal
11579 1726882181.51271: done checking for any_errors_fatal
11579 1726882181.51272: checking for max_fail_percentage
11579 1726882181.51273: done checking for max_fail_percentage
11579 1726882181.51274: checking to see if all hosts have failed and the running result is not ok
11579 1726882181.51275: done checking to see if all hosts have failed
11579 1726882181.51275: getting the remaining hosts for this loop
11579 1726882181.51277: done getting the remaining hosts for this loop
11579 1726882181.51281: getting the next task for host managed_node1
11579 1726882181.51286: done getting next task for host managed_node1
11579 1726882181.51289: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
11579 1726882181.51298: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882181.51310: getting variables
11579 1726882181.51311: in VariableManager get_vars()
11579 1726882181.51344: Calling all_inventory to load vars for managed_node1
11579 1726882181.51347: Calling groups_inventory to load vars for managed_node1
11579 1726882181.51349: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882181.51358: Calling all_plugins_play to load vars for managed_node1
11579 1726882181.51361: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882181.51364: Calling groups_plugins_play to load vars for managed_node1
11579 1726882181.51520: done sending task result for task 12673a56-9f93-f197-7423-00000000018a
11579 1726882181.51523: WORKER PROCESS EXITING
11579 1726882181.51542: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882181.51746: done with get_vars()
11579 1726882181.51757: done getting variables
11579 1726882181.51814: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17
Friday 20 September 2024 21:29:41 -0400 (0:00:00.031) 0:00:10.226 ******
11579 1726882181.51847: entering _queue_task() for managed_node1/set_fact
11579 1726882181.52120: worker is 1 (out of 1 available)
11579 1726882181.52132: exiting _queue_task() for managed_node1/set_fact
11579 1726882181.52142: done queuing things up, now waiting for results queue to drain
11579 1726882181.52143: waiting for pending results...
11579 1726882181.52401: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree
11579 1726882181.52539: in run() - task 12673a56-9f93-f197-7423-00000000018b
11579 1726882181.52559: variable 'ansible_search_path' from source: unknown
11579 1726882181.52567: variable 'ansible_search_path' from source: unknown
11579 1726882181.52610: calling self._execute()
11579 1726882181.52696: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.52710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.52733: variable 'omit' from source: magic vars
11579 1726882181.53087: variable 'ansible_distribution_major_version' from source: facts
11579 1726882181.53168: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882181.53274: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11579 1726882181.53628: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11579 1726882181.53673: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11579 1726882181.53715: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11579 1726882181.53753: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11579 1726882181.53844: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11579 1726882181.54000: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11579 1726882181.54004: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882181.54007: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11579 1726882181.54025: variable '__network_is_ostree' from source: set_fact
11579 1726882181.54036: Evaluated conditional (not __network_is_ostree is defined): False
11579 1726882181.54043: when evaluation is False, skipping this task
11579 1726882181.54049: _execute() done
11579 1726882181.54054: dumping result to json
11579 1726882181.54061: done dumping result, returning
11579 1726882181.54073: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-f197-7423-00000000018b]
11579 1726882181.54081: sending task result for task 12673a56-9f93-f197-7423-00000000018b
11579 1726882181.54303: done sending task result for task 12673a56-9f93-f197-7423-00000000018b
11579 1726882181.54307: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not __network_is_ostree is defined",
    "skip_reason": "Conditional result was False"
}
11579 1726882181.54354: no more pending results, returning what we have
11579 1726882181.54357: results queue empty
11579 1726882181.54358: checking for any_errors_fatal
11579 1726882181.54362: done checking for any_errors_fatal
11579 1726882181.54363: checking for max_fail_percentage
11579 1726882181.54365: done checking for max_fail_percentage
11579 1726882181.54366: checking to see if all hosts have failed and the running result is not ok
11579 1726882181.54367: done checking to see if all hosts have failed
11579 1726882181.54368: getting the remaining hosts for this loop
11579 1726882181.54369: done getting the remaining hosts for this loop
11579 1726882181.54373: getting the next task for host managed_node1
11579 1726882181.54382: done getting next task for host managed_node1
11579 1726882181.54386: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running
11579 1726882181.54390: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882181.54408: getting variables
11579 1726882181.54410: in VariableManager get_vars()
11579 1726882181.54455: Calling all_inventory to load vars for managed_node1
11579 1726882181.54458: Calling groups_inventory to load vars for managed_node1
11579 1726882181.54460: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882181.54472: Calling all_plugins_play to load vars for managed_node1
11579 1726882181.54475: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882181.54478: Calling groups_plugins_play to load vars for managed_node1
11579 1726882181.54888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882181.55067: done with get_vars()
11579 1726882181.55077: done getting variables

TASK [fedora.linux_system_roles.network : Check which services are running] ****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Friday 20 September 2024 21:29:41 -0400 (0:00:00.033) 0:00:10.260 ******
11579 1726882181.55170: entering _queue_task() for managed_node1/service_facts
11579 1726882181.55172: Creating lock for service_facts
11579 1726882181.55725: worker is 1 (out of 1 available)
11579 1726882181.55733: exiting _queue_task() for managed_node1/service_facts
11579 1726882181.55742: done queuing things up, now waiting for results queue to drain
11579 1726882181.55743: waiting for pending results...
11579 1726882181.55799: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running
11579 1726882181.55943: in run() - task 12673a56-9f93-f197-7423-00000000018d
11579 1726882181.55969: variable 'ansible_search_path' from source: unknown
11579 1726882181.55978: variable 'ansible_search_path' from source: unknown
11579 1726882181.56027: calling self._execute()
11579 1726882181.56120: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.56133: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.56148: variable 'omit' from source: magic vars
11579 1726882181.56521: variable 'ansible_distribution_major_version' from source: facts
11579 1726882181.56539: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882181.56550: variable 'omit' from source: magic vars
11579 1726882181.56629: variable 'omit' from source: magic vars
11579 1726882181.56667: variable 'omit' from source: magic vars
11579 1726882181.56714: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11579 1726882181.56758: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11579 1726882181.56783: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11579 1726882181.56810: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882181.56827: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882181.56867: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11579 1726882181.56876: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.56884: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.56996: Set connection var ansible_timeout to 10
11579 1726882181.57054: Set connection var ansible_shell_type to sh
11579 1726882181.57057: Set connection var ansible_module_compression to ZIP_DEFLATED
11579 1726882181.57060: Set connection var ansible_shell_executable to /bin/sh
11579 1726882181.57063: Set connection var ansible_pipelining to False
11579 1726882181.57065: Set connection var ansible_connection to ssh
11579 1726882181.57068: variable 'ansible_shell_executable' from source: unknown
11579 1726882181.57076: variable 'ansible_connection' from source: unknown
11579 1726882181.57083: variable 'ansible_module_compression' from source: unknown
11579 1726882181.57090: variable 'ansible_shell_type' from source: unknown
11579 1726882181.57102: variable 'ansible_shell_executable' from source: unknown
11579 1726882181.57109: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882181.57117: variable 'ansible_pipelining' from source: unknown
11579 1726882181.57124: variable 'ansible_timeout' from source: unknown
11579 1726882181.57164: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882181.57340: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__)
11579 1726882181.57356: variable 'omit' from source: magic vars
11579 1726882181.57366: starting attempt loop
11579 1726882181.57373: running the handler
11579 1726882181.57402: _low_level_execute_command(): starting
11579 1726882181.57473: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0'
11579 1726882181.58153: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
<<<
11579 1726882181.58252: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
11579 1726882181.58289: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
<<<
11579 1726882181.58306: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
<<<
11579 1726882181.58365: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2
<<<
11579 1726882181.60003: stdout chunk (state=3): >>>/root
<<<
11579 1726882181.60139: stderr chunk (state=3): >>>debug2: Received exit status from master 0
<<<
11579 1726882181.60151: stdout chunk (state=3): >>><<<
11579 1726882181.60163: stderr chunk (state=3): >>><<<
11579 1726882181.60185: _low_level_execute_command() done: rc=0, stdout=/root
, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match found
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65'
debug2: fd 3 setting O_NONBLOCK
debug2: mux_client_hello_exchange: master version 4
debug1: mux_client_request_session: master session id: 2
debug2: Received exit status from master 0
11579 1726882181.60210: _low_level_execute_command(): starting
11579 1726882181.60221: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297 `" && echo ansible-tmp-1726882181.6019151-12114-78078195645297="` echo /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297 `" ) && sleep 0'
11579 1726882181.60899: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024
debug1: Reading configuration data /root/.ssh/config
<<<
11579 1726882181.60902: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159
debug2: match not found
<<<
11579 1726882181.60905: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config
<<<
11579 1726882181.60907: stderr chunk (state=3): >>>debug1: configuration requests final Match pass
debug2: resolve_canonicalize: hostname 10.31.9.159 is address
debug1: re-parsing configuration
debug1: Reading configuration data /root/.ssh/config
<<<
11579
1726882181.60928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.60970: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882181.60991: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.61063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.62913: stdout chunk (state=3): >>>ansible-tmp-1726882181.6019151-12114-78078195645297=/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297 <<< 11579 1726882181.63026: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.63088: stderr chunk (state=3): >>><<< 11579 1726882181.63091: stdout chunk (state=3): >>><<< 11579 1726882181.63300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882181.6019151-12114-78078195645297=/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.63304: variable 'ansible_module_compression' from source: unknown 11579 1726882181.63306: ANSIBALLZ: Using lock for service_facts 11579 1726882181.63308: ANSIBALLZ: Acquiring lock 11579 1726882181.63310: ANSIBALLZ: Lock acquired: 139873762140208 11579 1726882181.63312: ANSIBALLZ: Creating module 11579 1726882181.72610: ANSIBALLZ: Writing module into payload 11579 1726882181.72677: ANSIBALLZ: Writing module 11579 1726882181.72700: ANSIBALLZ: Renaming module 11579 1726882181.72710: ANSIBALLZ: Done creating module 11579 1726882181.72727: variable 'ansible_facts' from source: unknown 11579 1726882181.72779: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py 11579 1726882181.72898: Sending initial data 11579 1726882181.72902: Sent initial data (161 bytes) 11579 1726882181.73353: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.73358: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.73361: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882181.73363: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882181.73365: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.73417: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.73420: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882181.73422: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.73478: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.75076: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 <<< 11579 1726882181.75082: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882181.75116: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882181.75159: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpz6yy09vz /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py <<< 11579 1726882181.75165: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py" <<< 11579 1726882181.75199: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpz6yy09vz" to remote "/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py" <<< 11579 1726882181.75745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.75787: stderr chunk (state=3): >>><<< 11579 1726882181.75791: stdout chunk (state=3): >>><<< 11579 1726882181.75812: done transferring module to remote 11579 1726882181.75821: _low_level_execute_command(): starting 11579 1726882181.75826: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/ /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py && sleep 0' 11579 1726882181.76256: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882181.76260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882181.76262: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.76264: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.76269: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.76323: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.76328: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.76368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882181.78097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882181.78120: stderr chunk (state=3): >>><<< 11579 1726882181.78123: stdout chunk (state=3): >>><<< 11579 1726882181.78140: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882181.78143: _low_level_execute_command(): starting 11579 1726882181.78147: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/AnsiballZ_service_facts.py && sleep 0' 11579 1726882181.78557: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882181.78598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882181.78601: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.78604: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882181.78606: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882181.78608: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882181.78651: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882181.78658: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882181.78660: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882181.78708: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.30757: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": 
"initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": 
"modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": 
"rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": 
"indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": 
"systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"<<< 11579 1726882183.30839: stdout chunk (state=3): >>>}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", 
"status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, 
"blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, 
"dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", 
"status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state"<<< 11579 1726882183.30848: stdout chunk (state=3): >>>: "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", 
"state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11579 1726882183.32260: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882183.32500: stderr chunk (state=3): >>><<< 11579 1726882183.32504: stdout chunk (state=3): >>><<< 11579 1726882183.32508: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", 
"status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": 
"modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": 
"systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", 
"status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": 
{"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": 
"systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": 
"running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": 
"inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", 
"source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": 
"systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", 
"status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": 
"systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, 
"systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882183.33997: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882183.34004: _low_level_execute_command(): starting 11579 1726882183.34009: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882181.6019151-12114-78078195645297/ > /dev/null 2>&1 && sleep 0' 11579 1726882183.34707: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.34760: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882183.34797: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.36645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882183.36670: stderr chunk (state=3): >>><<< 11579 1726882183.36673: stdout chunk (state=3): >>><<< 11579 1726882183.36687: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 11579 1726882183.36898: handler run complete 11579 1726882183.36911: variable 'ansible_facts' from source: unknown 11579 1726882183.37075: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882183.37585: variable 'ansible_facts' from source: unknown 11579 1726882183.37724: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882183.37941: attempt loop complete, returning result 11579 1726882183.37953: _execute() done 11579 1726882183.37960: dumping result to json 11579 1726882183.38034: done dumping result, returning 11579 1726882183.38048: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-f197-7423-00000000018d] 11579 1726882183.38057: sending task result for task 12673a56-9f93-f197-7423-00000000018d 11579 1726882183.39100: done sending task result for task 12673a56-9f93-f197-7423-00000000018d 11579 1726882183.39103: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882183.39149: no more pending results, returning what we have 11579 1726882183.39151: results queue empty 11579 1726882183.39152: checking for any_errors_fatal 11579 1726882183.39156: done checking for any_errors_fatal 11579 1726882183.39156: checking for max_fail_percentage 11579 1726882183.39158: done checking for max_fail_percentage 11579 1726882183.39158: checking to see if all hosts have failed and the running result is not ok 11579 1726882183.39159: done checking to see if all hosts have failed 11579 1726882183.39160: getting the remaining hosts for this loop 11579 1726882183.39161: done getting the remaining hosts for this loop 11579 1726882183.39164: getting the next task for host managed_node1 11579 
1726882183.39169: done getting next task for host managed_node1 11579 1726882183.39172: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11579 1726882183.39176: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882183.39184: getting variables 11579 1726882183.39185: in VariableManager get_vars() 11579 1726882183.39336: Calling all_inventory to load vars for managed_node1 11579 1726882183.39339: Calling groups_inventory to load vars for managed_node1 11579 1726882183.39341: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882183.39349: Calling all_plugins_play to load vars for managed_node1 11579 1726882183.39351: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882183.39354: Calling groups_plugins_play to load vars for managed_node1 11579 1726882183.39904: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882183.40384: done with get_vars() 11579 1726882183.40403: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:29:43 -0400 (0:00:01.853) 0:00:12.113 ****** 11579 1726882183.40499: entering _queue_task() for managed_node1/package_facts 11579 1726882183.40501: Creating lock for package_facts 11579 1726882183.40810: worker is 1 (out of 1 available) 11579 1726882183.40822: exiting _queue_task() for managed_node1/package_facts 11579 1726882183.41000: done queuing things up, now waiting for results queue to drain 11579 1726882183.41002: waiting for pending results... 
11579 1726882183.41132: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11579 1726882183.41339: in run() - task 12673a56-9f93-f197-7423-00000000018e 11579 1726882183.41343: variable 'ansible_search_path' from source: unknown 11579 1726882183.41345: variable 'ansible_search_path' from source: unknown 11579 1726882183.41348: calling self._execute() 11579 1726882183.41413: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882183.41425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882183.41553: variable 'omit' from source: magic vars 11579 1726882183.41979: variable 'ansible_distribution_major_version' from source: facts 11579 1726882183.41999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882183.42011: variable 'omit' from source: magic vars 11579 1726882183.42082: variable 'omit' from source: magic vars 11579 1726882183.42124: variable 'omit' from source: magic vars 11579 1726882183.42165: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882183.42207: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882183.42271: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882183.42274: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882183.42277: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882183.42336: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882183.42339: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882183.42341: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11579 1726882183.42406: Set connection var ansible_timeout to 10 11579 1726882183.42417: Set connection var ansible_shell_type to sh 11579 1726882183.42430: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882183.42444: Set connection var ansible_shell_executable to /bin/sh 11579 1726882183.42456: Set connection var ansible_pipelining to False 11579 1726882183.42464: Set connection var ansible_connection to ssh 11579 1726882183.42486: variable 'ansible_shell_executable' from source: unknown 11579 1726882183.42495: variable 'ansible_connection' from source: unknown 11579 1726882183.42504: variable 'ansible_module_compression' from source: unknown 11579 1726882183.42553: variable 'ansible_shell_type' from source: unknown 11579 1726882183.42556: variable 'ansible_shell_executable' from source: unknown 11579 1726882183.42558: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882183.42559: variable 'ansible_pipelining' from source: unknown 11579 1726882183.42561: variable 'ansible_timeout' from source: unknown 11579 1726882183.42563: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882183.42731: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882183.42746: variable 'omit' from source: magic vars 11579 1726882183.42754: starting attempt loop 11579 1726882183.42760: running the handler 11579 1726882183.42783: _low_level_execute_command(): starting 11579 1726882183.42880: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882183.43514: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882183.43543: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882183.43611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.43662: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882183.43680: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.43705: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882183.43779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.45517: stdout chunk (state=3): >>>/root <<< 11579 1726882183.45920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882183.45924: stdout chunk (state=3): >>><<< 11579 1726882183.45927: stderr chunk (state=3): >>><<< 11579 1726882183.45956: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882183.45980: _low_level_execute_command(): starting 11579 1726882183.46003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430 `" && echo ansible-tmp-1726882183.4596531-12187-94391197568430="` echo /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430 `" ) && sleep 0' 11579 1726882183.47336: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882183.47384: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882183.47492: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.47600: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.48009: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.49642: stdout chunk (state=3): >>>ansible-tmp-1726882183.4596531-12187-94391197568430=/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430 <<< 11579 1726882183.49736: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882183.49775: stderr chunk (state=3): >>><<< 11579 1726882183.49785: stdout chunk (state=3): >>><<< 11579 1726882183.49813: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882183.4596531-12187-94391197568430=/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882183.49864: variable 'ansible_module_compression' from source: unknown 11579 1726882183.49918: ANSIBALLZ: Using lock for package_facts 11579 1726882183.49925: ANSIBALLZ: Acquiring lock 11579 1726882183.49932: ANSIBALLZ: Lock acquired: 139873761205936 11579 1726882183.49938: ANSIBALLZ: Creating module 11579 1726882183.74542: ANSIBALLZ: Writing module into payload 11579 1726882183.74630: ANSIBALLZ: Writing module 11579 1726882183.74650: ANSIBALLZ: Renaming module 11579 1726882183.74664: ANSIBALLZ: Done creating module 11579 1726882183.74685: variable 'ansible_facts' from source: unknown 11579 1726882183.74801: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py 11579 1726882183.74911: Sending initial data 11579 1726882183.74914: Sent initial data (161 bytes) 11579 1726882183.75344: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882183.75347: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.75350: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration 
debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882183.75353: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882183.75355: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.75390: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.75395: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882183.75465: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.77004: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11579 1726882183.77007: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882183.77046: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882183.77089: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpii1b1fsz /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py <<< 11579 1726882183.77091: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py" <<< 11579 1726882183.77128: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpii1b1fsz" to remote "/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py" <<< 11579 1726882183.78157: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882183.78189: stderr chunk (state=3): >>><<< 11579 1726882183.78192: stdout chunk (state=3): >>><<< 11579 1726882183.78214: done transferring module to remote 11579 1726882183.78223: _low_level_execute_command(): starting 11579 1726882183.78227: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/ /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py && sleep 0' 11579 1726882183.78643: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882183.78646: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882183.78648: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.78651: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882183.78653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.78703: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.78718: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882183.78753: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882183.80457: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882183.80476: stderr chunk (state=3): >>><<< 11579 1726882183.80480: stdout chunk (state=3): >>><<< 11579 1726882183.80489: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882183.80496: _low_level_execute_command(): starting 11579 1726882183.80499: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/AnsiballZ_package_facts.py && sleep 0' 11579 1726882183.80900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882183.80903: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.80906: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882183.80908: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882183.80910: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882183.80950: stderr chunk 
(state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882183.80969: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882183.81013: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882184.24690: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", 
"version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks"<<< 11579 1726882184.24707: stdout chunk (state=3): >>>: [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", 
"version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": 
"0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "rele<<< 11579 1726882184.24734: stdout chunk (state=3): >>>ase": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", 
"release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": 
"diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certm<<< 11579 1726882184.24764: stdout chunk (state=3): >>>ap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": 
"xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", 
"release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "sou<<< 11579 1726882184.24770: stdout chunk (state=3): >>>rce": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arc<<< 11579 1726882184.24795: stdout chunk (state=3): >>>h": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": 
"2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", 
"release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.7<<< 11579 1726882184.24804: stdout chunk (state=3): >>>3.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"<<< 11579 1726882184.24849: stdout chunk (state=3): >>>}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": 
"pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": 
[{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", 
"release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": 
"510.el10", "epoch": 0, "arch": "x86_64", "source": "r<<< 11579 1726882184.24859: stdout chunk (state=3): >>>pm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", 
"release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", 
"source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1<<< 11579 1726882184.24867: stdout chunk (state=3): >>>.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, 
"arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11579 1726882184.24891: stdout chunk (state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": 
"wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.<<< 11579 1726882184.24905: stdout chunk (state=3): >>>26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": 
[{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "c<<< 11579 1726882184.24915: stdout chunk (state=3): >>>loud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11579 1726882184.26625: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882184.26656: stderr chunk (state=3): >>><<< 11579 1726882184.26659: stdout chunk (state=3): >>><<< 11579 1726882184.26697: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882184.28107: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882184.28125: _low_level_execute_command(): starting 11579 1726882184.28130: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882183.4596531-12187-94391197568430/ > /dev/null 2>&1 && sleep 0' 11579 1726882184.28556: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882184.28589: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882184.28592: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882184.28598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882184.28601: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882184.28603: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882184.28604: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882184.28653: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882184.28656: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882184.28662: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882184.28706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882184.30502: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882184.30527: stderr chunk (state=3): >>><<< 11579 1726882184.30530: stdout chunk (state=3): >>><<< 11579 1726882184.30543: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882184.30551: handler run complete 11579 1726882184.30991: variable 'ansible_facts' from source: unknown 11579 1726882184.31231: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.32253: variable 'ansible_facts' from source: unknown 11579 1726882184.32479: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.32851: attempt loop complete, returning result 11579 1726882184.32860: _execute() done 11579 1726882184.32863: dumping result to json 11579 1726882184.32978: done dumping result, returning 11579 1726882184.32985: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-f197-7423-00000000018e] 11579 1726882184.32988: sending task result for task 12673a56-9f93-f197-7423-00000000018e 11579 1726882184.34244: done sending task result for task 12673a56-9f93-f197-7423-00000000018e 11579 1726882184.34247: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882184.34284: no more pending results, returning what we have 11579 1726882184.34287: results queue empty 11579 1726882184.34288: checking for any_errors_fatal 11579 1726882184.34290: done checking for any_errors_fatal 11579 1726882184.34291: checking for max_fail_percentage 11579 1726882184.34292: done checking for max_fail_percentage 11579 1726882184.34292: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.34297: done checking to see if all hosts have failed 11579 1726882184.34298: getting the remaining 
hosts for this loop 11579 1726882184.34299: done getting the remaining hosts for this loop 11579 1726882184.34301: getting the next task for host managed_node1 11579 1726882184.34306: done getting next task for host managed_node1 11579 1726882184.34308: ^ task is: TASK: fedora.linux_system_roles.network : Print network provider 11579 1726882184.34310: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882184.34317: getting variables 11579 1726882184.34318: in VariableManager get_vars() 11579 1726882184.34343: Calling all_inventory to load vars for managed_node1 11579 1726882184.34345: Calling groups_inventory to load vars for managed_node1 11579 1726882184.34346: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.34352: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.34354: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.34356: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.35047: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.36080: done with get_vars() 11579 1726882184.36101: done getting variables 11579 1726882184.36161: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:29:44 -0400 (0:00:00.956) 0:00:13.070 ****** 11579 1726882184.36201: entering _queue_task() for managed_node1/debug 11579 1726882184.36482: worker is 1 (out of 1 available) 11579 1726882184.36696: exiting _queue_task() for managed_node1/debug 11579 1726882184.36710: done queuing things up, now waiting for results queue to drain 11579 1726882184.36712: waiting for pending results... 11579 1726882184.36837: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11579 1726882184.36889: in run() - task 12673a56-9f93-f197-7423-000000000027 11579 1726882184.36913: variable 'ansible_search_path' from source: unknown 11579 1726882184.36922: variable 'ansible_search_path' from source: unknown 11579 1726882184.36966: calling self._execute() 11579 1726882184.37054: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.37066: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.37082: variable 'omit' from source: magic vars 11579 1726882184.37404: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.37408: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.37412: variable 'omit' from source: magic vars 11579 1726882184.37450: variable 'omit' from source: magic vars 11579 1726882184.37520: variable 'network_provider' from source: set_fact 11579 1726882184.37532: variable 'omit' from source: magic vars 11579 1726882184.37565: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 
1726882184.37591: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882184.37609: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882184.37623: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882184.37632: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882184.37657: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882184.37660: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.37663: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.37731: Set connection var ansible_timeout to 10 11579 1726882184.37734: Set connection var ansible_shell_type to sh 11579 1726882184.37742: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882184.37746: Set connection var ansible_shell_executable to /bin/sh 11579 1726882184.37753: Set connection var ansible_pipelining to False 11579 1726882184.37756: Set connection var ansible_connection to ssh 11579 1726882184.37773: variable 'ansible_shell_executable' from source: unknown 11579 1726882184.37776: variable 'ansible_connection' from source: unknown 11579 1726882184.37779: variable 'ansible_module_compression' from source: unknown 11579 1726882184.37781: variable 'ansible_shell_type' from source: unknown 11579 1726882184.37783: variable 'ansible_shell_executable' from source: unknown 11579 1726882184.37785: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.37787: variable 'ansible_pipelining' from source: unknown 11579 1726882184.37791: variable 'ansible_timeout' from source: unknown 11579 1726882184.37797: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 11579 1726882184.37891: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882184.37899: variable 'omit' from source: magic vars 11579 1726882184.37905: starting attempt loop 11579 1726882184.37908: running the handler 11579 1726882184.37941: handler run complete 11579 1726882184.37951: attempt loop complete, returning result 11579 1726882184.37953: _execute() done 11579 1726882184.37956: dumping result to json 11579 1726882184.37958: done dumping result, returning 11579 1726882184.37965: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-f197-7423-000000000027] 11579 1726882184.37970: sending task result for task 12673a56-9f93-f197-7423-000000000027 11579 1726882184.38048: done sending task result for task 12673a56-9f93-f197-7423-000000000027 11579 1726882184.38051: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 11579 1726882184.38109: no more pending results, returning what we have 11579 1726882184.38112: results queue empty 11579 1726882184.38112: checking for any_errors_fatal 11579 1726882184.38121: done checking for any_errors_fatal 11579 1726882184.38122: checking for max_fail_percentage 11579 1726882184.38123: done checking for max_fail_percentage 11579 1726882184.38124: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.38125: done checking to see if all hosts have failed 11579 1726882184.38125: getting the remaining hosts for this loop 11579 1726882184.38127: done getting the remaining hosts for this loop 11579 1726882184.38130: getting the next task for host managed_node1 11579 1726882184.38136: done getting 
next task for host managed_node1 11579 1726882184.38140: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11579 1726882184.38142: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882184.38151: getting variables 11579 1726882184.38152: in VariableManager get_vars() 11579 1726882184.38184: Calling all_inventory to load vars for managed_node1 11579 1726882184.38186: Calling groups_inventory to load vars for managed_node1 11579 1726882184.38188: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.38200: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.38202: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.38205: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.39027: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.40529: done with get_vars() 11579 1726882184.40550: done getting variables 11579 1726882184.40638: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK 
[fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:29:44 -0400 (0:00:00.044) 0:00:13.115 ****** 11579 1726882184.40667: entering _queue_task() for managed_node1/fail 11579 1726882184.40669: Creating lock for fail 11579 1726882184.40946: worker is 1 (out of 1 available) 11579 1726882184.40962: exiting _queue_task() for managed_node1/fail 11579 1726882184.40975: done queuing things up, now waiting for results queue to drain 11579 1726882184.40976: waiting for pending results... 11579 1726882184.41321: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11579 1726882184.41398: in run() - task 12673a56-9f93-f197-7423-000000000028 11579 1726882184.41600: variable 'ansible_search_path' from source: unknown 11579 1726882184.41604: variable 'ansible_search_path' from source: unknown 11579 1726882184.41607: calling self._execute() 11579 1726882184.41609: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.41612: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.41614: variable 'omit' from source: magic vars 11579 1726882184.41959: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.41976: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.42105: variable 'network_state' from source: role '' defaults 11579 1726882184.42120: Evaluated conditional (network_state != {}): False 11579 1726882184.42129: when evaluation is False, skipping this task 11579 1726882184.42136: _execute() done 11579 1726882184.42143: dumping result to json 11579 1726882184.42151: 
done dumping result, returning 11579 1726882184.42167: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-f197-7423-000000000028] 11579 1726882184.42179: sending task result for task 12673a56-9f93-f197-7423-000000000028 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882184.42327: no more pending results, returning what we have 11579 1726882184.42331: results queue empty 11579 1726882184.42332: checking for any_errors_fatal 11579 1726882184.42337: done checking for any_errors_fatal 11579 1726882184.42338: checking for max_fail_percentage 11579 1726882184.42340: done checking for max_fail_percentage 11579 1726882184.42341: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.42342: done checking to see if all hosts have failed 11579 1726882184.42343: getting the remaining hosts for this loop 11579 1726882184.42345: done getting the remaining hosts for this loop 11579 1726882184.42348: getting the next task for host managed_node1 11579 1726882184.42356: done getting next task for host managed_node1 11579 1726882184.42359: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11579 1726882184.42362: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882184.42378: getting variables 11579 1726882184.42380: in VariableManager get_vars() 11579 1726882184.42425: Calling all_inventory to load vars for managed_node1 11579 1726882184.42429: Calling groups_inventory to load vars for managed_node1 11579 1726882184.42432: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.42444: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.42448: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.42451: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.43208: done sending task result for task 12673a56-9f93-f197-7423-000000000028 11579 1726882184.43211: WORKER PROCESS EXITING 11579 1726882184.43932: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.45590: done with get_vars() 11579 1726882184.45613: done getting variables 11579 1726882184.45668: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:29:44 -0400 (0:00:00.050) 0:00:13.165 ****** 11579 1726882184.45704: entering _queue_task() for managed_node1/fail 11579 1726882184.45989: worker is 1 (out of 1 available) 11579 1726882184.46003: exiting _queue_task() for managed_node1/fail 11579 1726882184.46015: done queuing things up, now waiting for results queue to drain 
11579 1726882184.46016: waiting for pending results... 11579 1726882184.46280: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11579 1726882184.46416: in run() - task 12673a56-9f93-f197-7423-000000000029 11579 1726882184.46438: variable 'ansible_search_path' from source: unknown 11579 1726882184.46447: variable 'ansible_search_path' from source: unknown 11579 1726882184.46486: calling self._execute() 11579 1726882184.46573: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.46585: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.46604: variable 'omit' from source: magic vars 11579 1726882184.46963: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.46979: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.47110: variable 'network_state' from source: role '' defaults 11579 1726882184.47125: Evaluated conditional (network_state != {}): False 11579 1726882184.47133: when evaluation is False, skipping this task 11579 1726882184.47140: _execute() done 11579 1726882184.47148: dumping result to json 11579 1726882184.47158: done dumping result, returning 11579 1726882184.47171: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-f197-7423-000000000029] 11579 1726882184.47182: sending task result for task 12673a56-9f93-f197-7423-000000000029 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882184.47330: no more pending results, returning what we have 11579 1726882184.47334: results queue empty 11579 1726882184.47335: checking for any_errors_fatal 
11579 1726882184.47343: done checking for any_errors_fatal 11579 1726882184.47344: checking for max_fail_percentage 11579 1726882184.47346: done checking for max_fail_percentage 11579 1726882184.47347: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.47348: done checking to see if all hosts have failed 11579 1726882184.47349: getting the remaining hosts for this loop 11579 1726882184.47351: done getting the remaining hosts for this loop 11579 1726882184.47354: getting the next task for host managed_node1 11579 1726882184.47361: done getting next task for host managed_node1 11579 1726882184.47365: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11579 1726882184.47368: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882184.47382: getting variables 11579 1726882184.47384: in VariableManager get_vars() 11579 1726882184.47426: Calling all_inventory to load vars for managed_node1 11579 1726882184.47428: Calling groups_inventory to load vars for managed_node1 11579 1726882184.47431: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.47441: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.47444: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.47446: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.48209: done sending task result for task 12673a56-9f93-f197-7423-000000000029 11579 1726882184.48212: WORKER PROCESS EXITING 11579 1726882184.48991: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.51021: done with get_vars() 11579 1726882184.51045: done getting variables 11579 1726882184.51121: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:29:44 -0400 (0:00:00.054) 0:00:13.219 ****** 11579 1726882184.51155: entering _queue_task() for managed_node1/fail 11579 1726882184.51490: worker is 1 (out of 1 available) 11579 1726882184.51509: exiting _queue_task() for managed_node1/fail 11579 1726882184.51521: done queuing things up, now waiting for results queue to drain 11579 1726882184.51523: waiting for pending results... 
11579 1726882184.51765: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11579 1726882184.51913: in run() - task 12673a56-9f93-f197-7423-00000000002a 11579 1726882184.51935: variable 'ansible_search_path' from source: unknown 11579 1726882184.51943: variable 'ansible_search_path' from source: unknown 11579 1726882184.51985: calling self._execute() 11579 1726882184.52088: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.52217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.52222: variable 'omit' from source: magic vars 11579 1726882184.52849: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.52892: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.53069: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882184.55433: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882184.55474: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882184.55510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882184.55535: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882184.55555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882184.55618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.55638: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.55655: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.55681: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.55692: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.55762: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.55774: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11579 1726882184.55853: variable 'ansible_distribution' from source: facts 11579 1726882184.55857: variable '__network_rh_distros' from source: role '' defaults 11579 1726882184.55864: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11579 1726882184.56018: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.56040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.56055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 
1726882184.56081: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.56091: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.56125: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.56145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.56162: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.56186: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.56199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.56238: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.56263: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11579 1726882184.56278: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.56307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.56329: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.56634: variable 'network_connections' from source: task vars 11579 1726882184.56637: variable 'controller_profile' from source: play vars 11579 1726882184.56684: variable 'controller_profile' from source: play vars 11579 1726882184.56700: variable 'controller_device' from source: play vars 11579 1726882184.56751: variable 'controller_device' from source: play vars 11579 1726882184.56764: variable 'port1_profile' from source: play vars 11579 1726882184.56817: variable 'port1_profile' from source: play vars 11579 1726882184.56825: variable 'dhcp_interface1' from source: play vars 11579 1726882184.56880: variable 'dhcp_interface1' from source: play vars 11579 1726882184.56886: variable 'controller_profile' from source: play vars 11579 1726882184.56944: variable 'controller_profile' from source: play vars 11579 1726882184.56950: variable 'port2_profile' from source: play vars 11579 1726882184.57008: variable 'port2_profile' from source: play vars 11579 1726882184.57026: variable 'dhcp_interface2' from source: play vars 11579 1726882184.57072: variable 'dhcp_interface2' from source: play vars 11579 1726882184.57078: variable 'controller_profile' from source: play vars 11579 1726882184.57150: variable 'controller_profile' from source: play vars 11579 1726882184.57158: 
variable 'network_state' from source: role '' defaults 11579 1726882184.57226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882184.57381: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882184.57417: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882184.57462: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882184.57469: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882184.57511: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882184.57571: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882184.57575: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.57577: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882184.57613: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11579 1726882184.57616: when evaluation is False, skipping this task 11579 1726882184.57619: _execute() done 11579 1726882184.57623: dumping result to 
json 11579 1726882184.57625: done dumping result, returning 11579 1726882184.57634: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-f197-7423-00000000002a] 11579 1726882184.57637: sending task result for task 12673a56-9f93-f197-7423-00000000002a 11579 1726882184.57739: done sending task result for task 12673a56-9f93-f197-7423-00000000002a 11579 1726882184.57742: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11579 1726882184.57831: no more pending results, returning what we have 11579 1726882184.57834: results queue empty 11579 1726882184.57834: checking for any_errors_fatal 11579 1726882184.57840: done checking for any_errors_fatal 11579 1726882184.57841: checking for max_fail_percentage 11579 1726882184.57842: done checking for max_fail_percentage 11579 1726882184.57843: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.57844: done checking to see if all hosts have failed 11579 1726882184.57844: getting the remaining hosts for this loop 11579 1726882184.57846: done getting the remaining hosts for this loop 11579 1726882184.57849: getting the next task for host managed_node1 11579 1726882184.57854: done getting next task for host managed_node1 11579 1726882184.57857: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11579 1726882184.57860: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, 
handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882184.57872: getting variables 11579 1726882184.57874: in VariableManager get_vars() 11579 1726882184.58002: Calling all_inventory to load vars for managed_node1 11579 1726882184.58005: Calling groups_inventory to load vars for managed_node1 11579 1726882184.58008: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.58016: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.58020: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.58024: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.58896: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.59839: done with get_vars() 11579 1726882184.59861: done getting variables 11579 1726882184.59953: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:29:44 -0400 (0:00:00.088) 0:00:13.308 ****** 
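The dnf check task that runs next clears two distribution gates and then stops at the wireless/team flags. A rough Python model of that layered evaluation, with assumed fact values for an EL10-style host (the real values come from gathered facts):

```python
# Assumed facts for an EL10-style host; not taken from the real run.
facts = {"distribution": "RedHat", "distribution_major_version": "10"}

gate_not_el6 = facts["distribution_major_version"] != "6"           # True
gate_dnf_era = (facts["distribution"] == "Fedora"
                or int(facts["distribution_major_version"]) > 7)    # True
# Role-computed flags; both end up False when no wireless or team
# profiles exist in network_connections.
wireless_defined = False
team_defined = False

run_dnf_check = (gate_not_el6 and gate_dnf_era
                 and (wireless_defined or team_defined))
print(run_dnf_check)  # False -> "Conditional result was False"
```

Only the final flag check fails, which is why the log shows the earlier conditionals evaluating True before the task is ultimately skipped.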
11579 1726882184.59983: entering _queue_task() for managed_node1/dnf 11579 1726882184.60313: worker is 1 (out of 1 available) 11579 1726882184.60327: exiting _queue_task() for managed_node1/dnf 11579 1726882184.60339: done queuing things up, now waiting for results queue to drain 11579 1726882184.60342: waiting for pending results... 11579 1726882184.60736: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11579 1726882184.60901: in run() - task 12673a56-9f93-f197-7423-00000000002b 11579 1726882184.60905: variable 'ansible_search_path' from source: unknown 11579 1726882184.60908: variable 'ansible_search_path' from source: unknown 11579 1726882184.60910: calling self._execute() 11579 1726882184.60930: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.60941: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.60955: variable 'omit' from source: magic vars 11579 1726882184.61324: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.61341: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.61553: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882184.63039: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882184.63089: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882184.63116: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882184.63141: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882184.63162: Loading FilterModule 'urlsplit' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882184.63300: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.63304: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.63307: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.63343: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.63363: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.63480: variable 'ansible_distribution' from source: facts 11579 1726882184.63491: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.63528: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11579 1726882184.63646: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882184.63773: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.63900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.63903: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.63906: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.63909: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.63930: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.63958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.63987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.64034: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.64055: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.64102: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.64130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.64159: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.64206: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.64230: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.64359: variable 'network_connections' from source: task vars 11579 1726882184.64369: variable 'controller_profile' from source: play vars 11579 1726882184.64419: variable 'controller_profile' from source: play vars 11579 1726882184.64428: variable 'controller_device' from source: play vars 11579 1726882184.64468: variable 'controller_device' from source: play vars 11579 1726882184.64477: variable 'port1_profile' from source: play vars 11579 1726882184.64528: variable 'port1_profile' from source: play vars 11579 1726882184.64534: variable 'dhcp_interface1' from source: play vars 11579 1726882184.64575: variable 'dhcp_interface1' from source: play vars 11579 1726882184.64581: variable 'controller_profile' from source: play vars 11579 1726882184.64627: variable 'controller_profile' from source: play vars 11579 1726882184.64633: variable 'port2_profile' from source: play vars 11579 
1726882184.64673: variable 'port2_profile' from source: play vars 11579 1726882184.64680: variable 'dhcp_interface2' from source: play vars 11579 1726882184.64727: variable 'dhcp_interface2' from source: play vars 11579 1726882184.64731: variable 'controller_profile' from source: play vars 11579 1726882184.64771: variable 'controller_profile' from source: play vars 11579 1726882184.64835: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882184.64944: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882184.64973: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882184.64996: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882184.65020: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882184.65053: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882184.65069: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882184.65086: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.65108: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882184.65151: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882184.65305: variable 
'network_connections' from source: task vars 11579 1726882184.65308: variable 'controller_profile' from source: play vars 11579 1726882184.65349: variable 'controller_profile' from source: play vars 11579 1726882184.65355: variable 'controller_device' from source: play vars 11579 1726882184.65402: variable 'controller_device' from source: play vars 11579 1726882184.65409: variable 'port1_profile' from source: play vars 11579 1726882184.65449: variable 'port1_profile' from source: play vars 11579 1726882184.65455: variable 'dhcp_interface1' from source: play vars 11579 1726882184.65500: variable 'dhcp_interface1' from source: play vars 11579 1726882184.65506: variable 'controller_profile' from source: play vars 11579 1726882184.65547: variable 'controller_profile' from source: play vars 11579 1726882184.65553: variable 'port2_profile' from source: play vars 11579 1726882184.65598: variable 'port2_profile' from source: play vars 11579 1726882184.65601: variable 'dhcp_interface2' from source: play vars 11579 1726882184.65644: variable 'dhcp_interface2' from source: play vars 11579 1726882184.65648: variable 'controller_profile' from source: play vars 11579 1726882184.65688: variable 'controller_profile' from source: play vars 11579 1726882184.65809: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11579 1726882184.65812: when evaluation is False, skipping this task 11579 1726882184.65813: _execute() done 11579 1726882184.65815: dumping result to json 11579 1726882184.65816: done dumping result, returning 11579 1726882184.65819: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-f197-7423-00000000002b] 11579 1726882184.65821: sending task result for task 12673a56-9f93-f197-7423-00000000002b 11579 1726882184.65880: done sending task result for 
task 12673a56-9f93-f197-7423-00000000002b 11579 1726882184.65882: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11579 1726882184.65954: no more pending results, returning what we have 11579 1726882184.65956: results queue empty 11579 1726882184.65957: checking for any_errors_fatal 11579 1726882184.65961: done checking for any_errors_fatal 11579 1726882184.65962: checking for max_fail_percentage 11579 1726882184.65963: done checking for max_fail_percentage 11579 1726882184.65964: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.65965: done checking to see if all hosts have failed 11579 1726882184.65965: getting the remaining hosts for this loop 11579 1726882184.65966: done getting the remaining hosts for this loop 11579 1726882184.65969: getting the next task for host managed_node1 11579 1726882184.65975: done getting next task for host managed_node1 11579 1726882184.65978: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11579 1726882184.65981: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882184.66002: getting variables 11579 1726882184.66003: in VariableManager get_vars() 11579 1726882184.66041: Calling all_inventory to load vars for managed_node1 11579 1726882184.66044: Calling groups_inventory to load vars for managed_node1 11579 1726882184.66046: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.66054: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.66055: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.66057: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.66787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.67648: done with get_vars() 11579 1726882184.67664: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11579 1726882184.67718: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:29:44 -0400 (0:00:00.077) 0:00:13.385 ****** 11579 1726882184.67742: entering _queue_task() for managed_node1/yum 11579 1726882184.67743: Creating lock for yum 11579 1726882184.67971: worker is 1 (out of 1 available) 11579 1726882184.67984: exiting _queue_task() for managed_node1/yum 11579 1726882184.67996: done queuing things up, now waiting for results queue to drain 11579 1726882184.67998: waiting for pending results... 
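The YUM-path task queued above is gated on `ansible_distribution_major_version | int < 8`. The explicit `int` cast matters: the fact is a string, and lexicographic string comparison gives the wrong answer for two-digit versions. A small illustration:

```python
major = "10"  # ansible_distribution_major_version is a string fact

print(major != "6")    # True  -> clears the EL6 guard (string inequality)
print(int(major) < 8)  # False -> YUM path correctly skipped on EL10
print(major < "8")     # True! lexicographic comparison ("1" < "8") would
                       # wrongly send an EL10 host down the YUM path
```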
11579 1726882184.68161: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11579 1726882184.68240: in run() - task 12673a56-9f93-f197-7423-00000000002c 11579 1726882184.68252: variable 'ansible_search_path' from source: unknown 11579 1726882184.68255: variable 'ansible_search_path' from source: unknown 11579 1726882184.68282: calling self._execute() 11579 1726882184.68347: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.68351: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.68360: variable 'omit' from source: magic vars 11579 1726882184.68619: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.68629: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.68746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882184.70411: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882184.70451: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882184.70477: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882184.70505: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882184.70527: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882184.70580: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.70602: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.70623: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.70649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.70659: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.70728: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.70739: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11579 1726882184.70741: when evaluation is False, skipping this task 11579 1726882184.70744: _execute() done 11579 1726882184.70747: dumping result to json 11579 1726882184.70749: done dumping result, returning 11579 1726882184.70755: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-f197-7423-00000000002c] 11579 1726882184.70760: sending task result for task 12673a56-9f93-f197-7423-00000000002c 11579 1726882184.70845: done sending task result for task 12673a56-9f93-f197-7423-00000000002c 11579 1726882184.70848: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11579 1726882184.70899: no more pending results, returning 
what we have 11579 1726882184.70903: results queue empty 11579 1726882184.70903: checking for any_errors_fatal 11579 1726882184.70907: done checking for any_errors_fatal 11579 1726882184.70908: checking for max_fail_percentage 11579 1726882184.70909: done checking for max_fail_percentage 11579 1726882184.70910: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.70911: done checking to see if all hosts have failed 11579 1726882184.70912: getting the remaining hosts for this loop 11579 1726882184.70913: done getting the remaining hosts for this loop 11579 1726882184.70916: getting the next task for host managed_node1 11579 1726882184.70922: done getting next task for host managed_node1 11579 1726882184.70926: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11579 1726882184.70928: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882184.70942: getting variables 11579 1726882184.70943: in VariableManager get_vars() 11579 1726882184.70980: Calling all_inventory to load vars for managed_node1 11579 1726882184.70983: Calling groups_inventory to load vars for managed_node1 11579 1726882184.70985: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.70994: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.70997: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.70999: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.71820: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.72663: done with get_vars() 11579 1726882184.72676: done getting variables 11579 1726882184.72721: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:29:44 -0400 (0:00:00.049) 0:00:13.435 ****** 11579 1726882184.72743: entering _queue_task() for managed_node1/fail 11579 1726882184.72948: worker is 1 (out of 1 available) 11579 1726882184.72960: exiting _queue_task() for managed_node1/fail 11579 1726882184.72972: done queuing things up, now waiting for results queue to drain 11579 1726882184.72973: waiting for pending results... 
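The task skipped just above was gated on the `when:` condition `ansible_distribution_major_version | int < 8`, which evaluated False on this node. A minimal Python sketch of that coercion-and-compare follows; it is illustrative only — Ansible actually evaluates the expression through Jinja2, and the fact values shown are assumed examples, not taken from this run:

```python
# Sketch of the skipped task's `when:` condition:
#   ansible_distribution_major_version | int < 8
# Ansible gathers the major-version fact as a string, so the `int`
# Jinja2 filter coerces it before the comparison.

def evaluate_when(distribution_major_version: str) -> bool:
    """Mimic `ansible_distribution_major_version | int < 8`."""
    return int(distribution_major_version) < 8

# On a node reporting e.g. "9", the condition is False, and the log
# records: when evaluation is False, skipping this task.
```

When the result is False the worker still returns a JSON result for the task, with `"skip_reason": "Conditional result was False"` and the failing expression echoed in `"false_condition"`, which is exactly the shape of the `skipping: [managed_node1]` block above.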
11579 1726882184.73140: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11579 1726882184.73219: in run() - task 12673a56-9f93-f197-7423-00000000002d 11579 1726882184.73230: variable 'ansible_search_path' from source: unknown 11579 1726882184.73234: variable 'ansible_search_path' from source: unknown 11579 1726882184.73261: calling self._execute() 11579 1726882184.73325: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.73329: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.73338: variable 'omit' from source: magic vars 11579 1726882184.73586: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.73597: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.73677: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882184.73803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882184.75201: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882184.75244: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882184.75269: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882184.75300: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882184.75320: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882184.75382: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11579 1726882184.75401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.75419: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.75444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.75455: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.75487: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.75511: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.75529: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.75552: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.75562: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.75590: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.75616: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.75632: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.75656: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.75666: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.75784: variable 'network_connections' from source: task vars 11579 1726882184.75792: variable 'controller_profile' from source: play vars 11579 1726882184.75845: variable 'controller_profile' from source: play vars 11579 1726882184.75853: variable 'controller_device' from source: play vars 11579 1726882184.75895: variable 'controller_device' from source: play vars 11579 1726882184.75905: variable 'port1_profile' from source: play vars 11579 1726882184.75949: variable 'port1_profile' from source: play vars 11579 1726882184.75955: variable 'dhcp_interface1' from source: play vars 11579 1726882184.75997: variable 'dhcp_interface1' from source: play vars 11579 1726882184.76005: variable 'controller_profile' from source: play vars 11579 
1726882184.76048: variable 'controller_profile' from source: play vars 11579 1726882184.76054: variable 'port2_profile' from source: play vars 11579 1726882184.76100: variable 'port2_profile' from source: play vars 11579 1726882184.76107: variable 'dhcp_interface2' from source: play vars 11579 1726882184.76149: variable 'dhcp_interface2' from source: play vars 11579 1726882184.76155: variable 'controller_profile' from source: play vars 11579 1726882184.76197: variable 'controller_profile' from source: play vars 11579 1726882184.76242: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882184.76362: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882184.76386: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882184.76412: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882184.76433: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882184.76464: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882184.76481: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882184.76503: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.76520: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, 
class_only=False) 11579 1726882184.76566: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882184.76715: variable 'network_connections' from source: task vars 11579 1726882184.76719: variable 'controller_profile' from source: play vars 11579 1726882184.76760: variable 'controller_profile' from source: play vars 11579 1726882184.76765: variable 'controller_device' from source: play vars 11579 1726882184.76811: variable 'controller_device' from source: play vars 11579 1726882184.76818: variable 'port1_profile' from source: play vars 11579 1726882184.76860: variable 'port1_profile' from source: play vars 11579 1726882184.76866: variable 'dhcp_interface1' from source: play vars 11579 1726882184.76912: variable 'dhcp_interface1' from source: play vars 11579 1726882184.76917: variable 'controller_profile' from source: play vars 11579 1726882184.76958: variable 'controller_profile' from source: play vars 11579 1726882184.76964: variable 'port2_profile' from source: play vars 11579 1726882184.77007: variable 'port2_profile' from source: play vars 11579 1726882184.77012: variable 'dhcp_interface2' from source: play vars 11579 1726882184.77055: variable 'dhcp_interface2' from source: play vars 11579 1726882184.77060: variable 'controller_profile' from source: play vars 11579 1726882184.77104: variable 'controller_profile' from source: play vars 11579 1726882184.77132: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11579 1726882184.77135: when evaluation is False, skipping this task 11579 1726882184.77138: _execute() done 11579 1726882184.77140: dumping result to json 11579 1726882184.77142: done dumping result, returning 11579 1726882184.77144: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f197-7423-00000000002d] 11579 1726882184.77149: sending 
task result for task 12673a56-9f93-f197-7423-00000000002d 11579 1726882184.77226: done sending task result for task 12673a56-9f93-f197-7423-00000000002d 11579 1726882184.77228: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11579 1726882184.77276: no more pending results, returning what we have 11579 1726882184.77279: results queue empty 11579 1726882184.77279: checking for any_errors_fatal 11579 1726882184.77284: done checking for any_errors_fatal 11579 1726882184.77285: checking for max_fail_percentage 11579 1726882184.77287: done checking for max_fail_percentage 11579 1726882184.77287: checking to see if all hosts have failed and the running result is not ok 11579 1726882184.77288: done checking to see if all hosts have failed 11579 1726882184.77289: getting the remaining hosts for this loop 11579 1726882184.77290: done getting the remaining hosts for this loop 11579 1726882184.77296: getting the next task for host managed_node1 11579 1726882184.77301: done getting next task for host managed_node1 11579 1726882184.77305: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11579 1726882184.77307: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882184.77320: getting variables 11579 1726882184.77321: in VariableManager get_vars() 11579 1726882184.77358: Calling all_inventory to load vars for managed_node1 11579 1726882184.77361: Calling groups_inventory to load vars for managed_node1 11579 1726882184.77363: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882184.77371: Calling all_plugins_play to load vars for managed_node1 11579 1726882184.77373: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882184.77375: Calling groups_plugins_play to load vars for managed_node1 11579 1726882184.78115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882184.78966: done with get_vars() 11579 1726882184.78980: done getting variables 11579 1726882184.79022: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:29:44 -0400 (0:00:00.063) 0:00:13.498 ****** 11579 1726882184.79045: entering _queue_task() for managed_node1/package 11579 1726882184.79250: worker is 1 (out of 1 available) 11579 1726882184.79263: exiting _queue_task() for managed_node1/package 11579 1726882184.79273: done queuing things up, now waiting for results queue to drain 11579 1726882184.79275: waiting for pending results... 
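The "Install packages" task queued here is later skipped because its condition `not network_packages is subset(ansible_facts.packages.keys())` evaluates False, i.e. every required network package already appears in the gathered package facts. Ansible's `subset` test is plain set containment; a hedged Python sketch of the equivalent logic (the package names below are assumed examples, not data from this run):

```python
# Sketch of the Install packages task's `when:` condition:
#   not network_packages is subset(ansible_facts.packages.keys())
# Ansible's builtin `subset` test checks set containment, so the task
# only runs when at least one required package is missing.

def needs_install(network_packages, installed_packages) -> bool:
    """Mimic `not network_packages is subset(installed)`."""
    return not set(network_packages) <= set(installed_packages)

# With all packages present, the condition is False and the task is
# skipped with skip_reason "Conditional result was False".
```

This is why the log shows no package transaction: the conditional short-circuits the whole `package` action before any module is shipped to the host.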
11579 1726882184.79442: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11579 1726882184.79518: in run() - task 12673a56-9f93-f197-7423-00000000002e 11579 1726882184.79529: variable 'ansible_search_path' from source: unknown 11579 1726882184.79533: variable 'ansible_search_path' from source: unknown 11579 1726882184.79557: calling self._execute() 11579 1726882184.79625: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882184.79629: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882184.79638: variable 'omit' from source: magic vars 11579 1726882184.79885: variable 'ansible_distribution_major_version' from source: facts 11579 1726882184.79897: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882184.80022: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882184.80198: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882184.80230: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882184.80255: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882184.80278: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882184.80353: variable 'network_packages' from source: role '' defaults 11579 1726882184.80426: variable '__network_provider_setup' from source: role '' defaults 11579 1726882184.80438: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882184.80479: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882184.80487: variable '__network_packages_default_nm' from source: role '' defaults 11579 1726882184.80599: variable 
'__network_packages_default_nm' from source: role '' defaults 11579 1726882184.80720: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882184.82435: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882184.82481: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882184.82510: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882184.82536: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882184.82555: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882184.82612: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.82636: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.82651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.82676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.82687: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 
1726882184.82722: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.82739: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.82757: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.82781: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.82791: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.82935: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11579 1726882184.83008: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.83024: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.83040: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.83063: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.83075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.83139: variable 'ansible_python' from source: facts 11579 1726882184.83157: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11579 1726882184.83218: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882184.83272: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11579 1726882184.83361: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.83377: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.83394: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.83428: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.83438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.83469: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882184.83488: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882184.83515: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.83600: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882184.83603: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882184.83718: variable 'network_connections' from source: task vars 11579 1726882184.83729: variable 'controller_profile' from source: play vars 11579 1726882184.83831: variable 'controller_profile' from source: play vars 11579 1726882184.83845: variable 'controller_device' from source: play vars 11579 1726882184.83947: variable 'controller_device' from source: play vars 11579 1726882184.83963: variable 'port1_profile' from source: play vars 11579 1726882184.84066: variable 'port1_profile' from source: play vars 11579 1726882184.84081: variable 'dhcp_interface1' from source: play vars 11579 1726882184.84200: variable 'dhcp_interface1' from source: play vars 11579 1726882184.84203: variable 'controller_profile' from source: play vars 11579 1726882184.84274: variable 'controller_profile' from source: play vars 11579 1726882184.84287: variable 'port2_profile' from source: play vars 11579 
1726882184.84387: variable 'port2_profile' from source: play vars 11579 1726882184.84407: variable 'dhcp_interface2' from source: play vars 11579 1726882184.84600: variable 'dhcp_interface2' from source: play vars 11579 1726882184.84604: variable 'controller_profile' from source: play vars 11579 1726882184.84617: variable 'controller_profile' from source: play vars 11579 1726882184.84684: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882184.84722: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882184.84754: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882184.84787: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882184.84841: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882184.85122: variable 'network_connections' from source: task vars 11579 1726882184.85132: variable 'controller_profile' from source: play vars 11579 1726882184.85233: variable 'controller_profile' from source: play vars 11579 1726882184.85247: variable 'controller_device' from source: play vars 11579 1726882184.85341: variable 'controller_device' from source: play vars 11579 1726882184.85356: variable 'port1_profile' from source: play vars 11579 1726882184.85460: variable 'port1_profile' from source: play vars 11579 1726882184.85475: variable 'dhcp_interface1' from source: play vars 11579 1726882184.85584: variable 'dhcp_interface1' from source: 
play vars
11579 1726882184.85606: variable 'controller_profile' from source: play vars
11579 1726882184.85712: variable 'controller_profile' from source: play vars
11579 1726882184.85900: variable 'port2_profile' from source: play vars
11579 1726882184.85903: variable 'port2_profile' from source: play vars
11579 1726882184.85905: variable 'dhcp_interface2' from source: play vars
11579 1726882184.85941: variable 'dhcp_interface2' from source: play vars
11579 1726882184.85955: variable 'controller_profile' from source: play vars
11579 1726882184.86060: variable 'controller_profile' from source: play vars
11579 1726882184.86125: variable '__network_packages_default_wireless' from source: role '' defaults
11579 1726882184.86211: variable '__network_wireless_connections_defined' from source: role '' defaults
11579 1726882184.86538: variable 'network_connections' from source: task vars
11579 1726882184.86548: variable 'controller_profile' from source: play vars
11579 1726882184.86613: variable 'controller_profile' from source: play vars
11579 1726882184.86626: variable 'controller_device' from source: play vars
11579 1726882184.86686: variable 'controller_device' from source: play vars
11579 1726882184.86705: variable 'port1_profile' from source: play vars
11579 1726882184.86768: variable 'port1_profile' from source: play vars
11579 1726882184.86780: variable 'dhcp_interface1' from source: play vars
11579 1726882184.86850: variable 'dhcp_interface1' from source: play vars
11579 1726882184.86863: variable 'controller_profile' from source: play vars
11579 1726882184.86930: variable 'controller_profile' from source: play vars
11579 1726882184.86942: variable 'port2_profile' from source: play vars
11579 1726882184.87010: variable 'port2_profile' from source: play vars
11579 1726882184.87023: variable 'dhcp_interface2' from source: play vars
11579 1726882184.87086: variable 'dhcp_interface2' from source: play vars
11579 1726882184.87102: variable 'controller_profile' from source: play vars
11579 1726882184.87165: variable 'controller_profile' from source: play vars
11579 1726882184.87199: variable '__network_packages_default_team' from source: role '' defaults
11579 1726882184.87279: variable '__network_team_connections_defined' from source: role '' defaults
11579 1726882184.87602: variable 'network_connections' from source: task vars
11579 1726882184.87701: variable 'controller_profile' from source: play vars
11579 1726882184.87704: variable 'controller_profile' from source: play vars
11579 1726882184.87706: variable 'controller_device' from source: play vars
11579 1726882184.87761: variable 'controller_device' from source: play vars
11579 1726882184.87773: variable 'port1_profile' from source: play vars
11579 1726882184.87849: variable 'port1_profile' from source: play vars
11579 1726882184.87861: variable 'dhcp_interface1' from source: play vars
11579 1726882184.87935: variable 'dhcp_interface1' from source: play vars
11579 1726882184.87947: variable 'controller_profile' from source: play vars
11579 1726882184.88014: variable 'controller_profile' from source: play vars
11579 1726882184.88026: variable 'port2_profile' from source: play vars
11579 1726882184.88089: variable 'port2_profile' from source: play vars
11579 1726882184.88147: variable 'dhcp_interface2' from source: play vars
11579 1726882184.88172: variable 'dhcp_interface2' from source: play vars
11579 1726882184.88183: variable 'controller_profile' from source: play vars
11579 1726882184.88256: variable 'controller_profile' from source: play vars
11579 1726882184.88325: variable '__network_service_name_default_initscripts' from source: role '' defaults
11579 1726882184.88392: variable '__network_service_name_default_initscripts' from source: role '' defaults
11579 1726882184.88410: variable '__network_packages_default_initscripts' from source: role '' defaults
11579 1726882184.88500: variable '__network_packages_default_initscripts' from source: role '' defaults
11579 1726882184.88703: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults
11579 1726882184.89186: variable 'network_connections' from source: task vars
11579 1726882184.89202: variable 'controller_profile' from source: play vars
11579 1726882184.89302: variable 'controller_profile' from source: play vars
11579 1726882184.89310: variable 'controller_device' from source: play vars
11579 1726882184.89360: variable 'controller_device' from source: play vars
11579 1726882184.89374: variable 'port1_profile' from source: play vars
11579 1726882184.89439: variable 'port1_profile' from source: play vars
11579 1726882184.89501: variable 'dhcp_interface1' from source: play vars
11579 1726882184.89521: variable 'dhcp_interface1' from source: play vars
11579 1726882184.89533: variable 'controller_profile' from source: play vars
11579 1726882184.89601: variable 'controller_profile' from source: play vars
11579 1726882184.89613: variable 'port2_profile' from source: play vars
11579 1726882184.89677: variable 'port2_profile' from source: play vars
11579 1726882184.89689: variable 'dhcp_interface2' from source: play vars
11579 1726882184.89753: variable 'dhcp_interface2' from source: play vars
11579 1726882184.89765: variable 'controller_profile' from source: play vars
11579 1726882184.89890: variable 'controller_profile' from source: play vars
11579 1726882184.89896: variable 'ansible_distribution' from source: facts
11579 1726882184.89899: variable '__network_rh_distros' from source: role '' defaults
11579 1726882184.89902: variable 'ansible_distribution_major_version' from source: facts
11579 1726882184.89904: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults
11579 1726882184.90054: variable 'ansible_distribution' from source: facts
11579 1726882184.90063: variable '__network_rh_distros' from source: role '' defaults
11579 1726882184.90073: variable 'ansible_distribution_major_version' from source: facts
11579 1726882184.90091: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults
11579 1726882184.90262: variable 'ansible_distribution' from source: facts
11579 1726882184.90270: variable '__network_rh_distros' from source: role '' defaults
11579 1726882184.90277: variable 'ansible_distribution_major_version' from source: facts
11579 1726882184.90326: variable 'network_provider' from source: set_fact
11579 1726882184.90500: variable 'ansible_facts' from source: unknown
11579 1726882184.91018: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False
11579 1726882184.91027: when evaluation is False, skipping this task
11579 1726882184.91034: _execute() done
11579 1726882184.91043: dumping result to json
11579 1726882184.91052: done dumping result, returning
11579 1726882184.91064: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-f197-7423-00000000002e]
11579 1726882184.91074: sending task result for task 12673a56-9f93-f197-7423-00000000002e
11579 1726882184.91329: done sending task result for task 12673a56-9f93-f197-7423-00000000002e
11579 1726882184.91333: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "not network_packages is subset(ansible_facts.packages.keys())",
    "skip_reason": "Conditional result was False"
}
11579 1726882184.91382: no more pending results, returning what we have
11579 1726882184.91386: results queue empty
11579 1726882184.91386: checking for any_errors_fatal
11579 1726882184.91396: done checking for any_errors_fatal
11579 1726882184.91397: checking for max_fail_percentage
11579 1726882184.91399: done checking for max_fail_percentage
11579 1726882184.91400: checking to see if all hosts have failed and the running result is not ok
11579 1726882184.91401: done checking to see if all hosts have failed
11579 1726882184.91402: getting the remaining hosts for this loop
11579 1726882184.91403: done getting the remaining hosts for this loop
11579 1726882184.91407: getting the next task for host managed_node1
11579 1726882184.91414: done getting next task for host managed_node1
11579 1726882184.91417: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11579 1726882184.91421: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882184.91436: getting variables
11579 1726882184.91438: in VariableManager get_vars()
11579 1726882184.91479: Calling all_inventory to load vars for managed_node1
11579 1726882184.91482: Calling groups_inventory to load vars for managed_node1
11579 1726882184.91485: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882184.91678: Calling all_plugins_play to load vars for managed_node1
11579 1726882184.91683: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882184.91687: Calling groups_plugins_play to load vars for managed_node1
11579 1726882184.93054: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882184.94586: done with get_vars()
11579 1726882184.94615: done getting variables
11579 1726882184.94670: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85
Friday 20 September 2024  21:29:44 -0400 (0:00:00.156)       0:00:13.655 ******
11579 1726882184.94703: entering _queue_task() for managed_node1/package
11579 1726882184.95120: worker is 1 (out of 1 available)
11579 1726882184.95130: exiting _queue_task() for managed_node1/package
11579 1726882184.95141: done queuing things up, now waiting for results queue to drain
11579 1726882184.95142: waiting for pending results...
11579 1726882184.95318: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable
11579 1726882184.95477: in run() - task 12673a56-9f93-f197-7423-00000000002f
11579 1726882184.95482: variable 'ansible_search_path' from source: unknown
11579 1726882184.95485: variable 'ansible_search_path' from source: unknown
11579 1726882184.95518: calling self._execute()
11579 1726882184.95696: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882184.95702: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882184.95705: variable 'omit' from source: magic vars
11579 1726882184.96002: variable 'ansible_distribution_major_version' from source: facts
11579 1726882184.96019: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882184.96152: variable 'network_state' from source: role '' defaults
11579 1726882184.96167: Evaluated conditional (network_state != {}): False
11579 1726882184.96175: when evaluation is False, skipping this task
11579 1726882184.96182: _execute() done
11579 1726882184.96189: dumping result to json
11579 1726882184.96202: done dumping result, returning
11579 1726882184.96214: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-f197-7423-00000000002f]
11579 1726882184.96225: sending task result for task 12673a56-9f93-f197-7423-00000000002f
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11579 1726882184.96407: no more pending results, returning what we have
11579 1726882184.96411: results queue empty
11579 1726882184.96412: checking for any_errors_fatal
11579 1726882184.96416: done checking for any_errors_fatal
11579 1726882184.96417: checking for max_fail_percentage
11579 1726882184.96419: done checking for max_fail_percentage
11579 1726882184.96421: checking to see if all hosts have failed and the running result is not ok
11579 1726882184.96422: done checking to see if all hosts have failed
11579 1726882184.96422: getting the remaining hosts for this loop
11579 1726882184.96424: done getting the remaining hosts for this loop
11579 1726882184.96427: getting the next task for host managed_node1
11579 1726882184.96435: done getting next task for host managed_node1
11579 1726882184.96438: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11579 1726882184.96441: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882184.96457: getting variables
11579 1726882184.96459: in VariableManager get_vars()
11579 1726882184.96504: Calling all_inventory to load vars for managed_node1
11579 1726882184.96507: Calling groups_inventory to load vars for managed_node1
11579 1726882184.96509: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882184.96521: Calling all_plugins_play to load vars for managed_node1
11579 1726882184.96524: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882184.96527: Calling groups_plugins_play to load vars for managed_node1
11579 1726882184.97307: done sending task result for task 12673a56-9f93-f197-7423-00000000002f
11579 1726882184.97311: WORKER PROCESS EXITING
11579 1726882184.98044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882185.03558: done with get_vars()
11579 1726882185.03581: done getting variables
11579 1726882185.03635: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96
Friday 20 September 2024  21:29:45 -0400 (0:00:00.089)       0:00:13.745 ******
11579 1726882185.03666: entering _queue_task() for managed_node1/package
11579 1726882185.04021: worker is 1 (out of 1 available)
11579 1726882185.04034: exiting _queue_task() for managed_node1/package
11579 1726882185.04046: done queuing things up, now waiting for results queue to drain
11579 1726882185.04047: waiting for pending results...
11579 1726882185.04318: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable
11579 1726882185.04451: in run() - task 12673a56-9f93-f197-7423-000000000030
11579 1726882185.04472: variable 'ansible_search_path' from source: unknown
11579 1726882185.04480: variable 'ansible_search_path' from source: unknown
11579 1726882185.04528: calling self._execute()
11579 1726882185.04629: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882185.04642: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882185.04656: variable 'omit' from source: magic vars
11579 1726882185.05041: variable 'ansible_distribution_major_version' from source: facts
11579 1726882185.05062: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882185.05192: variable 'network_state' from source: role '' defaults
11579 1726882185.05212: Evaluated conditional (network_state != {}): False
11579 1726882185.05221: when evaluation is False, skipping this task
11579 1726882185.05228: _execute() done
11579 1726882185.05236: dumping result to json
11579 1726882185.05243: done dumping result, returning
11579 1726882185.05256: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-f197-7423-000000000030]
11579 1726882185.05267: sending task result for task 12673a56-9f93-f197-7423-000000000030
11579 1726882185.05383: done sending task result for task 12673a56-9f93-f197-7423-000000000030
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "network_state != {}",
    "skip_reason": "Conditional result was False"
}
11579 1726882185.05438: no more pending results, returning what we have
11579 1726882185.05441: results queue empty
11579 1726882185.05442: checking for any_errors_fatal
11579 1726882185.05451: done checking for any_errors_fatal
11579 1726882185.05452: checking for max_fail_percentage
11579 1726882185.05453: done checking for max_fail_percentage
11579 1726882185.05454: checking to see if all hosts have failed and the running result is not ok
11579 1726882185.05455: done checking to see if all hosts have failed
11579 1726882185.05456: getting the remaining hosts for this loop
11579 1726882185.05458: done getting the remaining hosts for this loop
11579 1726882185.05461: getting the next task for host managed_node1
11579 1726882185.05469: done getting next task for host managed_node1
11579 1726882185.05473: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
11579 1726882185.05476: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882185.05496: getting variables
11579 1726882185.05498: in VariableManager get_vars()
11579 1726882185.05541: Calling all_inventory to load vars for managed_node1
11579 1726882185.05543: Calling groups_inventory to load vars for managed_node1
11579 1726882185.05546: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882185.05557: Calling all_plugins_play to load vars for managed_node1
11579 1726882185.05560: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882185.05563: Calling groups_plugins_play to load vars for managed_node1
11579 1726882185.06308: WORKER PROCESS EXITING
11579 1726882185.07188: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882185.08740: done with get_vars()
11579 1726882185.08764: done getting variables
11579 1726882185.08859: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=False, class_only=True)
TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] ***
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109
Friday 20 September 2024  21:29:45 -0400 (0:00:00.052)       0:00:13.797 ******
11579 1726882185.08896: entering _queue_task() for managed_node1/service
11579 1726882185.08898: Creating lock for service
11579 1726882185.09329: worker is 1 (out of 1 available)
11579 1726882185.09342: exiting _queue_task() for managed_node1/service
11579 1726882185.09353: done queuing things up, now waiting for results queue to drain
11579 1726882185.09354: waiting for pending results...
11579 1726882185.09537: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces
11579 1726882185.09678: in run() - task 12673a56-9f93-f197-7423-000000000031
11579 1726882185.09708: variable 'ansible_search_path' from source: unknown
11579 1726882185.09717: variable 'ansible_search_path' from source: unknown
11579 1726882185.09759: calling self._execute()
11579 1726882185.09858: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882185.09873: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882185.09888: variable 'omit' from source: magic vars
11579 1726882185.10278: variable 'ansible_distribution_major_version' from source: facts
11579 1726882185.10300: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882185.10424: variable '__network_wireless_connections_defined' from source: role '' defaults
11579 1726882185.10628: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11579 1726882185.12767: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11579 1726882185.12855: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11579 1726882185.12901: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11579 1726882185.12943: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11579 1726882185.12978: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11579 1726882185.13067: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.13108: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.13140: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.13190: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.13215: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882185.13291: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.13303: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.13335: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.13401: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.13405: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882185.13445: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.13473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.13617: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.13621: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.13624: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882185.13756: variable 'network_connections' from source: task vars
11579 1726882185.13774: variable 'controller_profile' from source: play vars
11579 1726882185.13852: variable 'controller_profile' from source: play vars
11579 1726882185.13868: variable 'controller_device' from source: play vars
11579 1726882185.13934: variable 'controller_device' from source: play vars
11579 1726882185.13955: variable 'port1_profile' from source: play vars
11579 1726882185.14019: variable 'port1_profile' from source: play vars
11579 1726882185.14033: variable 'dhcp_interface1' from source: play vars
11579 1726882185.14101: variable 'dhcp_interface1' from source: play vars
11579 1726882185.14114: variable 'controller_profile' from source: play vars
11579 1726882185.14173: variable 'controller_profile' from source: play vars
11579 1726882185.14183: variable 'port2_profile' from source: play vars
11579 1726882185.14246: variable 'port2_profile' from source: play vars
11579 1726882185.14260: variable 'dhcp_interface2' from source: play vars
11579 1726882185.14329: variable 'dhcp_interface2' from source: play vars
11579 1726882185.14341: variable 'controller_profile' from source: play vars
11579 1726882185.14409: variable 'controller_profile' from source: play vars
11579 1726882185.14474: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name
11579 1726882185.15027: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py
11579 1726882185.15034: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py
11579 1726882185.15069: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py
11579 1726882185.15108: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py
11579 1726882185.15162: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False)
11579 1726882185.15187: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False)
11579 1726882185.15220: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.15299: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False)
11579 1726882185.15330: variable '__network_team_connections_defined' from source: role '' defaults
11579 1726882185.15588: variable 'network_connections' from source: task vars
11579 1726882185.15604: variable 'controller_profile' from source: play vars
11579 1726882185.15671: variable 'controller_profile' from source: play vars
11579 1726882185.15688: variable 'controller_device' from source: play vars
11579 1726882185.15753: variable 'controller_device' from source: play vars
11579 1726882185.15789: variable 'port1_profile' from source: play vars
11579 1726882185.15835: variable 'port1_profile' from source: play vars
11579 1726882185.15848: variable 'dhcp_interface1' from source: play vars
11579 1726882185.15918: variable 'dhcp_interface1' from source: play vars
11579 1726882185.16010: variable 'controller_profile' from source: play vars
11579 1726882185.16013: variable 'controller_profile' from source: play vars
11579 1726882185.16016: variable 'port2_profile' from source: play vars
11579 1726882185.16069: variable 'port2_profile' from source: play vars
11579 1726882185.16085: variable 'dhcp_interface2' from source: play vars
11579 1726882185.16156: variable 'dhcp_interface2' from source: play vars
11579 1726882185.16169: variable 'controller_profile' from source: play vars
11579 1726882185.16239: variable 'controller_profile' from source: play vars
11579 1726882185.16277: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False
11579 1726882185.16286: when evaluation is False, skipping this task
11579 1726882185.16297: _execute() done
11579 1726882185.16306: dumping result to json
11579 1726882185.16314: done dumping result, returning
11579 1726882185.16325: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f197-7423-000000000031]
11579 1726882185.16340: sending task result for task 12673a56-9f93-f197-7423-000000000031
11579 1726882185.16606: done sending task result for task 12673a56-9f93-f197-7423-000000000031
11579 1726882185.16609: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined",
    "skip_reason": "Conditional result was False"
}
11579 1726882185.16657: no more pending results, returning what we have
11579 1726882185.16661: results queue empty
11579 1726882185.16661: checking for any_errors_fatal
11579 1726882185.16668: done checking for any_errors_fatal
11579 1726882185.16669: checking for max_fail_percentage
11579 1726882185.16671: done checking for max_fail_percentage
11579 1726882185.16672: checking to see if all hosts have failed and the running result is not ok
11579 1726882185.16673: done checking to see if all hosts have failed
11579 1726882185.16673: getting the remaining hosts for this loop
11579 1726882185.16675: done getting the remaining hosts for this loop
11579 1726882185.16678: getting the next task for host managed_node1
11579 1726882185.16686: done getting next task for host managed_node1
11579 1726882185.16690: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
11579 1726882185.16697: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882185.16712: getting variables
11579 1726882185.16714: in VariableManager get_vars()
11579 1726882185.16759: Calling all_inventory to load vars for managed_node1
11579 1726882185.16762: Calling groups_inventory to load vars for managed_node1
11579 1726882185.16764: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882185.16775: Calling all_plugins_play to load vars for managed_node1
11579 1726882185.16778: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882185.16781: Calling groups_plugins_play to load vars for managed_node1
11579 1726882185.18399: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882185.19940: done with get_vars()
11579 1726882185.19961: done getting variables
11579 1726882185.20027: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)
TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] *****
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Friday 20 September 2024  21:29:45 -0400 (0:00:00.111)       0:00:13.909 ******
11579 1726882185.20060: entering _queue_task() for managed_node1/service
11579 1726882185.20480: worker is 1 (out of 1 available)
11579 1726882185.20497: exiting _queue_task() for managed_node1/service
11579 1726882185.20508: done queuing things up, now waiting for results queue to drain
11579 1726882185.20509: waiting for pending results...
11579 1726882185.20789: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager
11579 1726882185.20845: in run() - task 12673a56-9f93-f197-7423-000000000032
11579 1726882185.20867: variable 'ansible_search_path' from source: unknown
11579 1726882185.20875: variable 'ansible_search_path' from source: unknown
11579 1726882185.20927: calling self._execute()
11579 1726882185.21123: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882185.21127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882185.21129: variable 'omit' from source: magic vars
11579 1726882185.21440: variable 'ansible_distribution_major_version' from source: facts
11579 1726882185.21449: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882185.21561: variable 'network_provider' from source: set_fact
11579 1726882185.21564: variable 'network_state' from source: role '' defaults
11579 1726882185.21577: Evaluated conditional (network_provider == "nm" or network_state != {}): True
11579 1726882185.21580: variable 'omit' from source: magic vars
11579 1726882185.21617: variable 'omit' from source: magic vars
11579 1726882185.21638: variable 'network_service_name' from source: role '' defaults
11579 1726882185.21685: variable 'network_service_name' from source: role '' defaults
11579 1726882185.21762: variable '__network_provider_setup' from source: role '' defaults
11579 1726882185.21765: variable '__network_service_name_default_nm' from source: role '' defaults
11579 1726882185.21818: variable '__network_service_name_default_nm' from source: role '' defaults
11579 1726882185.21825: variable '__network_packages_default_nm' from source: role '' defaults
11579 1726882185.21867: variable '__network_packages_default_nm' from source: role '' defaults
11579 1726882185.22016: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name
11579 1726882185.23698: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py
11579 1726882185.23825: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py
11579 1726882185.23828: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py
11579 1726882185.23866: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py
11579 1726882185.23898: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py
11579 1726882185.23991: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.24043: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.24061: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.24086: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.24106: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882185.24145: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.24167: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.24184: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.24213: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.24223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False)
11579 1726882185.24374: variable '__network_packages_default_gobject_packages' from source: role '' defaults
11579 1726882185.24451: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False)
11579 1726882185.24475: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False)
11579 1726882185.24489: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False)
11579 1726882185.24517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False)
11579 1726882185.24528: Loading FilterModule
'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882185.24588: variable 'ansible_python' from source: facts 11579 1726882185.24607: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11579 1726882185.24705: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882185.24800: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11579 1726882185.24889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882185.24931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882185.24957: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882185.25001: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882185.25027: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882185.25077: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882185.25198: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882185.25203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882185.25205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882185.25216: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882185.25373: variable 'network_connections' from source: task vars 11579 1726882185.25386: variable 'controller_profile' from source: play vars 11579 1726882185.25472: variable 'controller_profile' from source: play vars 11579 1726882185.25491: variable 'controller_device' from source: play vars 11579 1726882185.25581: variable 'controller_device' from source: play vars 11579 1726882185.25612: variable 'port1_profile' from source: play vars 11579 1726882185.25703: variable 'port1_profile' from source: play vars 11579 1726882185.25756: variable 'dhcp_interface1' from source: play vars 11579 1726882185.25798: variable 'dhcp_interface1' from source: play vars 11579 1726882185.25823: variable 'controller_profile' from source: play vars 11579 1726882185.26009: variable 'controller_profile' from source: play vars 11579 1726882185.26012: variable 'port2_profile' from source: play vars 11579 1726882185.26240: variable 'port2_profile' from source: play vars 11579 1726882185.26307: variable 'dhcp_interface2' from source: play vars 11579 1726882185.26516: variable 'dhcp_interface2' from source: play vars 11579 
1726882185.26518: variable 'controller_profile' from source: play vars 11579 1726882185.26583: variable 'controller_profile' from source: play vars 11579 1726882185.26803: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882185.27034: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882185.27082: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882185.27128: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882185.27168: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882185.27232: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882185.27267: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882185.27296: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882185.27328: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882185.27377: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882185.27645: variable 'network_connections' from source: task vars 11579 1726882185.27652: variable 'controller_profile' from source: play vars 11579 1726882185.27723: variable 'controller_profile' from source: play vars 11579 
1726882185.27734: variable 'controller_device' from source: play vars 11579 1726882185.27810: variable 'controller_device' from source: play vars 11579 1726882185.27817: variable 'port1_profile' from source: play vars 11579 1726882185.27884: variable 'port1_profile' from source: play vars 11579 1726882185.27899: variable 'dhcp_interface1' from source: play vars 11579 1726882185.27967: variable 'dhcp_interface1' from source: play vars 11579 1726882185.27978: variable 'controller_profile' from source: play vars 11579 1726882185.28200: variable 'controller_profile' from source: play vars 11579 1726882185.28203: variable 'port2_profile' from source: play vars 11579 1726882185.28205: variable 'port2_profile' from source: play vars 11579 1726882185.28207: variable 'dhcp_interface2' from source: play vars 11579 1726882185.28210: variable 'dhcp_interface2' from source: play vars 11579 1726882185.28219: variable 'controller_profile' from source: play vars 11579 1726882185.28282: variable 'controller_profile' from source: play vars 11579 1726882185.28333: variable '__network_packages_default_wireless' from source: role '' defaults 11579 1726882185.28410: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882185.29135: variable 'network_connections' from source: task vars 11579 1726882185.29139: variable 'controller_profile' from source: play vars 11579 1726882185.29207: variable 'controller_profile' from source: play vars 11579 1726882185.29214: variable 'controller_device' from source: play vars 11579 1726882185.29281: variable 'controller_device' from source: play vars 11579 1726882185.29289: variable 'port1_profile' from source: play vars 11579 1726882185.29562: variable 'port1_profile' from source: play vars 11579 1726882185.29569: variable 'dhcp_interface1' from source: play vars 11579 1726882185.29748: variable 'dhcp_interface1' from source: play vars 11579 1726882185.29751: variable 'controller_profile' from source: play vars 
11579 1726882185.29918: variable 'controller_profile' from source: play vars 11579 1726882185.29925: variable 'port2_profile' from source: play vars 11579 1726882185.29990: variable 'port2_profile' from source: play vars 11579 1726882185.29998: variable 'dhcp_interface2' from source: play vars 11579 1726882185.30076: variable 'dhcp_interface2' from source: play vars 11579 1726882185.30080: variable 'controller_profile' from source: play vars 11579 1726882185.30291: variable 'controller_profile' from source: play vars 11579 1726882185.30449: variable '__network_packages_default_team' from source: role '' defaults 11579 1726882185.30532: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882185.31139: variable 'network_connections' from source: task vars 11579 1726882185.31143: variable 'controller_profile' from source: play vars 11579 1726882185.31413: variable 'controller_profile' from source: play vars 11579 1726882185.31599: variable 'controller_device' from source: play vars 11579 1726882185.31603: variable 'controller_device' from source: play vars 11579 1726882185.31605: variable 'port1_profile' from source: play vars 11579 1726882185.31659: variable 'port1_profile' from source: play vars 11579 1726882185.31666: variable 'dhcp_interface1' from source: play vars 11579 1726882185.31912: variable 'dhcp_interface1' from source: play vars 11579 1726882185.31918: variable 'controller_profile' from source: play vars 11579 1726882185.31984: variable 'controller_profile' from source: play vars 11579 1726882185.31990: variable 'port2_profile' from source: play vars 11579 1726882185.32060: variable 'port2_profile' from source: play vars 11579 1726882185.32066: variable 'dhcp_interface2' from source: play vars 11579 1726882185.32535: variable 'dhcp_interface2' from source: play vars 11579 1726882185.32541: variable 'controller_profile' from source: play vars 11579 1726882185.32812: variable 'controller_profile' from source: play vars 
11579 1726882185.32873: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882185.32931: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882185.32938: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882185.32997: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882185.33582: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11579 1726882185.34552: variable 'network_connections' from source: task vars 11579 1726882185.34556: variable 'controller_profile' from source: play vars 11579 1726882185.34618: variable 'controller_profile' from source: play vars 11579 1726882185.34625: variable 'controller_device' from source: play vars 11579 1726882185.34682: variable 'controller_device' from source: play vars 11579 1726882185.34689: variable 'port1_profile' from source: play vars 11579 1726882185.34951: variable 'port1_profile' from source: play vars 11579 1726882185.34957: variable 'dhcp_interface1' from source: play vars 11579 1726882185.35100: variable 'dhcp_interface1' from source: play vars 11579 1726882185.35104: variable 'controller_profile' from source: play vars 11579 1726882185.35106: variable 'controller_profile' from source: play vars 11579 1726882185.35108: variable 'port2_profile' from source: play vars 11579 1726882185.35338: variable 'port2_profile' from source: play vars 11579 1726882185.35345: variable 'dhcp_interface2' from source: play vars 11579 1726882185.35403: variable 'dhcp_interface2' from source: play vars 11579 1726882185.35409: variable 'controller_profile' from source: play vars 11579 1726882185.35463: variable 'controller_profile' from source: play vars 11579 1726882185.35471: variable 'ansible_distribution' from source: facts 11579 1726882185.35474: variable '__network_rh_distros' from source: role '' defaults 11579 1726882185.35499: 
variable 'ansible_distribution_major_version' from source: facts 11579 1726882185.35711: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11579 1726882185.35988: variable 'ansible_distribution' from source: facts 11579 1726882185.35991: variable '__network_rh_distros' from source: role '' defaults 11579 1726882185.35997: variable 'ansible_distribution_major_version' from source: facts 11579 1726882185.36299: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11579 1726882185.36482: variable 'ansible_distribution' from source: facts 11579 1726882185.36486: variable '__network_rh_distros' from source: role '' defaults 11579 1726882185.36491: variable 'ansible_distribution_major_version' from source: facts 11579 1726882185.36531: variable 'network_provider' from source: set_fact 11579 1726882185.36555: variable 'omit' from source: magic vars 11579 1726882185.36583: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882185.36614: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882185.36632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882185.36648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882185.36659: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882185.36689: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882185.36696: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882185.36699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882185.36998: Set connection var ansible_timeout to 10 11579 1726882185.37001: 
Set connection var ansible_shell_type to sh 11579 1726882185.37011: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882185.37016: Set connection var ansible_shell_executable to /bin/sh 11579 1726882185.37023: Set connection var ansible_pipelining to False 11579 1726882185.37026: Set connection var ansible_connection to ssh 11579 1726882185.37049: variable 'ansible_shell_executable' from source: unknown 11579 1726882185.37052: variable 'ansible_connection' from source: unknown 11579 1726882185.37055: variable 'ansible_module_compression' from source: unknown 11579 1726882185.37057: variable 'ansible_shell_type' from source: unknown 11579 1726882185.37059: variable 'ansible_shell_executable' from source: unknown 11579 1726882185.37069: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882185.37071: variable 'ansible_pipelining' from source: unknown 11579 1726882185.37074: variable 'ansible_timeout' from source: unknown 11579 1726882185.37075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882185.37374: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882185.37383: variable 'omit' from source: magic vars 11579 1726882185.37502: starting attempt loop 11579 1726882185.37506: running the handler 11579 1726882185.37508: variable 'ansible_facts' from source: unknown 11579 1726882185.39299: _low_level_execute_command(): starting 11579 1726882185.39302: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882185.40453: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 
11579 1726882185.40457: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882185.40461: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.40632: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882185.40645: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882185.40712: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882185.40779: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882185.42523: stdout chunk (state=3): >>>/root <<< 11579 1726882185.42655: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882185.42659: stdout chunk (state=3): >>><<< 11579 1726882185.42669: stderr chunk (state=3): >>><<< 11579 1726882185.42743: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882185.42762: _low_level_execute_command(): starting 11579 1726882185.42772: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120 `" && echo ansible-tmp-1726882185.4274523-12257-189166580188120="` echo /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120 `" ) && sleep 0' 11579 1726882185.43438: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.43500: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882185.43507: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882185.43555: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882185.43607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882185.45456: stdout chunk (state=3): >>>ansible-tmp-1726882185.4274523-12257-189166580188120=/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120 <<< 11579 1726882185.45760: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882185.45763: stdout chunk (state=3): >>><<< 11579 1726882185.45765: stderr chunk (state=3): >>><<< 11579 1726882185.45767: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882185.4274523-12257-189166580188120=/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882185.45772: variable 'ansible_module_compression' from source: unknown 11579 1726882185.45775: ANSIBALLZ: Using generic lock for ansible.legacy.systemd 11579 1726882185.45777: ANSIBALLZ: Acquiring lock 11579 1726882185.45779: ANSIBALLZ: Lock acquired: 139873763448672 11579 1726882185.45781: ANSIBALLZ: Creating module 11579 1726882185.68457: ANSIBALLZ: Writing module into payload 11579 1726882185.68560: ANSIBALLZ: Writing module 11579 1726882185.68581: ANSIBALLZ: Renaming module 11579 1726882185.68587: ANSIBALLZ: Done creating module 11579 1726882185.68620: variable 'ansible_facts' from source: unknown 11579 1726882185.68750: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py 11579 1726882185.68856: Sending initial data 11579 1726882185.68859: Sent initial data (156 bytes) 11579 1726882185.69299: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882185.69331: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882185.69334: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882185.69336: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.69339: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882185.69341: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882185.69343: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.69396: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882185.69401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882185.69404: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882185.69452: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882185.70988: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11579 1726882185.70997: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882185.71027: stderr chunk (state=3): >>>debug2: Sending 
SSH2_FXP_REALPATH "." <<< 11579 1726882185.71069: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpxcrsgw99 /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py <<< 11579 1726882185.71082: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py" <<< 11579 1726882185.71110: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpxcrsgw99" to remote "/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py" <<< 11579 1726882185.72148: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882185.72191: stderr chunk (state=3): >>><<< 11579 1726882185.72196: stdout chunk (state=3): >>><<< 11579 1726882185.72229: done transferring module to remote 11579 1726882185.72238: _low_level_execute_command(): starting 11579 1726882185.72242: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/ /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py && sleep 0' 11579 1726882185.72670: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882185.72673: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882185.72676: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882185.72678: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882185.72680: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.72730: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882185.72733: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882185.72777: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882185.74500: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882185.74522: stderr chunk (state=3): >>><<< 11579 1726882185.74525: stdout chunk (state=3): >>><<< 11579 1726882185.74541: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882185.74544: _low_level_execute_command(): starting 11579 1726882185.74549: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/AnsiballZ_systemd.py && sleep 0' 11579 1726882185.74954: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882185.74957: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.74976: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882185.75025: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882185.75028: 
stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882185.75081: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.03641: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": 
"{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10002432", "MemoryPeak": "10530816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319226368", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "382925000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": 
"infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11579 1726882186.03663: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", 
"CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", 
"RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 
21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11579 1726882186.05424: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882186.05448: stderr chunk (state=3): >>><<< 11579 1726882186.05451: stdout chunk (state=3): >>><<< 11579 1726882186.05468: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": 
"1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10002432", "MemoryPeak": "10530816", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3319226368", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "382925000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", 
"CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", 
"OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", 
"InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882186.05586: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882186.05605: _low_level_execute_command(): starting 11579 1726882186.05608: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882185.4274523-12257-189166580188120/ > /dev/null 2>&1 && sleep 0' 11579 1726882186.06176: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882186.06180: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882186.06183: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882186.06185: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.06233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882186.06268: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.08204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882186.08208: stdout chunk (state=3): >>><<< 11579 1726882186.08210: stderr chunk (state=3): >>><<< 11579 1726882186.08212: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882186.08215: handler run complete 11579 1726882186.08217: attempt loop complete, returning result 11579 1726882186.08219: _execute() done 11579 1726882186.08220: dumping result to json 11579 1726882186.08235: done dumping result, returning 11579 1726882186.08248: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-f197-7423-000000000032] 11579 1726882186.08256: sending task result for task 12673a56-9f93-f197-7423-000000000032 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882186.09744: no more pending results, returning what we have 11579 1726882186.09748: results queue empty 11579 1726882186.09748: checking for any_errors_fatal 11579 1726882186.09753: done checking for any_errors_fatal 11579 1726882186.09754: checking for max_fail_percentage 11579 1726882186.09755: done checking for max_fail_percentage 11579 1726882186.09756: checking to see if all hosts have failed and the running result is not ok 11579 1726882186.09758: done checking to see if all hosts have failed 11579 1726882186.09759: getting the remaining hosts for this loop 11579 1726882186.09760: done getting the remaining hosts for this loop 11579 1726882186.09764: getting the 
next task for host managed_node1 11579 1726882186.09770: done getting next task for host managed_node1 11579 1726882186.09773: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11579 1726882186.09776: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882186.09786: getting variables 11579 1726882186.09788: in VariableManager get_vars() 11579 1726882186.09831: Calling all_inventory to load vars for managed_node1 11579 1726882186.09834: Calling groups_inventory to load vars for managed_node1 11579 1726882186.09836: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882186.09847: Calling all_plugins_play to load vars for managed_node1 11579 1726882186.09850: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882186.09854: Calling groups_plugins_play to load vars for managed_node1 11579 1726882186.10411: done sending task result for task 12673a56-9f93-f197-7423-000000000032 11579 1726882186.10415: WORKER PROCESS EXITING 11579 1726882186.11454: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882186.13048: done with get_vars() 11579 1726882186.13072: done getting variables 11579 1726882186.13131: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:29:46 -0400 (0:00:00.931) 0:00:14.840 ****** 11579 1726882186.13168: entering _queue_task() for managed_node1/service 11579 1726882186.13705: worker is 1 (out of 1 available) 11579 1726882186.13712: exiting _queue_task() for managed_node1/service 11579 1726882186.13722: done queuing things up, now waiting for results queue to drain 11579 1726882186.13724: waiting for pending results... 11579 1726882186.13785: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11579 1726882186.13915: in run() - task 12673a56-9f93-f197-7423-000000000033 11579 1726882186.13938: variable 'ansible_search_path' from source: unknown 11579 1726882186.13953: variable 'ansible_search_path' from source: unknown 11579 1726882186.13995: calling self._execute() 11579 1726882186.14092: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.14107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.14121: variable 'omit' from source: magic vars 11579 1726882186.14513: variable 'ansible_distribution_major_version' from source: facts 11579 1726882186.14531: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882186.14651: variable 'network_provider' from source: set_fact 11579 1726882186.14698: Evaluated conditional (network_provider == "nm"): True 11579 1726882186.14749: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882186.14836: variable '__network_ieee802_1x_connections_defined' from source: role '' 
defaults 11579 1726882186.15019: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882186.17432: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882186.17506: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882186.17551: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882186.17588: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882186.17624: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882186.17708: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882186.17760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882186.17779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882186.17869: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882186.17872: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882186.17902: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882186.17934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882186.17961: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882186.18010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882186.18029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882186.18075: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882186.18151: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882186.18154: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882186.18181: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 
1726882186.18209: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882186.18353: variable 'network_connections' from source: task vars 11579 1726882186.18374: variable 'controller_profile' from source: play vars 11579 1726882186.18454: variable 'controller_profile' from source: play vars 11579 1726882186.18470: variable 'controller_device' from source: play vars 11579 1726882186.18545: variable 'controller_device' from source: play vars 11579 1726882186.18562: variable 'port1_profile' from source: play vars 11579 1726882186.18656: variable 'port1_profile' from source: play vars 11579 1726882186.18668: variable 'dhcp_interface1' from source: play vars 11579 1726882186.18843: variable 'dhcp_interface1' from source: play vars 11579 1726882186.18847: variable 'controller_profile' from source: play vars 11579 1726882186.18849: variable 'controller_profile' from source: play vars 11579 1726882186.18851: variable 'port2_profile' from source: play vars 11579 1726882186.18888: variable 'port2_profile' from source: play vars 11579 1726882186.18904: variable 'dhcp_interface2' from source: play vars 11579 1726882186.18977: variable 'dhcp_interface2' from source: play vars 11579 1726882186.18989: variable 'controller_profile' from source: play vars 11579 1726882186.19058: variable 'controller_profile' from source: play vars 11579 1726882186.19142: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882186.19340: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882186.19388: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882186.19430: Loading TestModule 'mathstuff' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882186.19501: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882186.19525: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882186.19552: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882186.19581: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882186.19698: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882186.19701: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882186.19922: variable 'network_connections' from source: task vars 11579 1726882186.19941: variable 'controller_profile' from source: play vars 11579 1726882186.20003: variable 'controller_profile' from source: play vars 11579 1726882186.20014: variable 'controller_device' from source: play vars 11579 1726882186.20080: variable 'controller_device' from source: play vars 11579 1726882186.20095: variable 'port1_profile' from source: play vars 11579 1726882186.20157: variable 'port1_profile' from source: play vars 11579 1726882186.20169: variable 'dhcp_interface1' from source: play vars 11579 1726882186.20227: variable 'dhcp_interface1' from source: play vars 11579 1726882186.20238: variable 'controller_profile' from source: play vars 11579 1726882186.20304: variable 'controller_profile' from source: play vars 11579 
1726882186.20372: variable 'port2_profile' from source: play vars 11579 1726882186.20400: variable 'port2_profile' from source: play vars 11579 1726882186.20414: variable 'dhcp_interface2' from source: play vars 11579 1726882186.20484: variable 'dhcp_interface2' from source: play vars 11579 1726882186.20499: variable 'controller_profile' from source: play vars 11579 1726882186.20559: variable 'controller_profile' from source: play vars 11579 1726882186.20619: Evaluated conditional (__network_wpa_supplicant_required): False 11579 1726882186.20630: when evaluation is False, skipping this task 11579 1726882186.20638: _execute() done 11579 1726882186.20645: dumping result to json 11579 1726882186.20701: done dumping result, returning 11579 1726882186.20704: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-f197-7423-000000000033] 11579 1726882186.20706: sending task result for task 12673a56-9f93-f197-7423-000000000033 11579 1726882186.20771: done sending task result for task 12673a56-9f93-f197-7423-000000000033 11579 1726882186.20774: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11579 1726882186.20850: no more pending results, returning what we have 11579 1726882186.20853: results queue empty 11579 1726882186.20854: checking for any_errors_fatal 11579 1726882186.20873: done checking for any_errors_fatal 11579 1726882186.20874: checking for max_fail_percentage 11579 1726882186.20875: done checking for max_fail_percentage 11579 1726882186.20876: checking to see if all hosts have failed and the running result is not ok 11579 1726882186.20877: done checking to see if all hosts have failed 11579 1726882186.20878: getting the remaining hosts for this loop 11579 1726882186.20879: done getting the remaining hosts for this loop 11579 1726882186.20883: getting the 
next task for host managed_node1 11579 1726882186.20890: done getting next task for host managed_node1 11579 1726882186.20896: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11579 1726882186.20899: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882186.20913: getting variables 11579 1726882186.20915: in VariableManager get_vars() 11579 1726882186.20959: Calling all_inventory to load vars for managed_node1 11579 1726882186.20962: Calling groups_inventory to load vars for managed_node1 11579 1726882186.20964: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882186.20975: Calling all_plugins_play to load vars for managed_node1 11579 1726882186.20977: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882186.20980: Calling groups_plugins_play to load vars for managed_node1 11579 1726882186.22812: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882186.24457: done with get_vars() 11579 1726882186.24481: done getting variables 11579 1726882186.24542: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable 
network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:29:46 -0400 (0:00:00.114) 0:00:14.954 ****** 11579 1726882186.24579: entering _queue_task() for managed_node1/service 11579 1726882186.25025: worker is 1 (out of 1 available) 11579 1726882186.25036: exiting _queue_task() for managed_node1/service 11579 1726882186.25045: done queuing things up, now waiting for results queue to drain 11579 1726882186.25046: waiting for pending results... 11579 1726882186.25286: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11579 1726882186.25436: in run() - task 12673a56-9f93-f197-7423-000000000034 11579 1726882186.25440: variable 'ansible_search_path' from source: unknown 11579 1726882186.25443: variable 'ansible_search_path' from source: unknown 11579 1726882186.25460: calling self._execute() 11579 1726882186.25570: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.25583: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.25603: variable 'omit' from source: magic vars 11579 1726882186.26033: variable 'ansible_distribution_major_version' from source: facts 11579 1726882186.26037: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882186.26169: variable 'network_provider' from source: set_fact 11579 1726882186.26199: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882186.26202: when evaluation is False, skipping this task 11579 1726882186.26205: _execute() done 11579 1726882186.26207: dumping result to json 11579 1726882186.26249: done dumping result, returning 11579 1726882186.26253: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-f197-7423-000000000034] 11579 1726882186.26255: sending task result 
for task 12673a56-9f93-f197-7423-000000000034 11579 1726882186.26457: done sending task result for task 12673a56-9f93-f197-7423-000000000034 11579 1726882186.26460: WORKER PROCESS EXITING skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882186.26507: no more pending results, returning what we have 11579 1726882186.26511: results queue empty 11579 1726882186.26512: checking for any_errors_fatal 11579 1726882186.26524: done checking for any_errors_fatal 11579 1726882186.26526: checking for max_fail_percentage 11579 1726882186.26528: done checking for max_fail_percentage 11579 1726882186.26529: checking to see if all hosts have failed and the running result is not ok 11579 1726882186.26530: done checking to see if all hosts have failed 11579 1726882186.26531: getting the remaining hosts for this loop 11579 1726882186.26532: done getting the remaining hosts for this loop 11579 1726882186.26536: getting the next task for host managed_node1 11579 1726882186.26544: done getting next task for host managed_node1 11579 1726882186.26548: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11579 1726882186.26552: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882186.26568: getting variables 11579 1726882186.26570: in VariableManager get_vars() 11579 1726882186.26618: Calling all_inventory to load vars for managed_node1 11579 1726882186.26622: Calling groups_inventory to load vars for managed_node1 11579 1726882186.26625: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882186.26818: Calling all_plugins_play to load vars for managed_node1 11579 1726882186.26822: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882186.26825: Calling groups_plugins_play to load vars for managed_node1 11579 1726882186.28152: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882186.29929: done with get_vars() 11579 1726882186.29957: done getting variables 11579 1726882186.30027: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:29:46 -0400 (0:00:00.054) 0:00:15.009 ****** 11579 1726882186.30067: entering _queue_task() for managed_node1/copy 11579 1726882186.30522: worker is 1 (out of 1 available) 11579 1726882186.30536: exiting _queue_task() for managed_node1/copy 11579 1726882186.30547: done queuing things up, now waiting for results queue to drain 11579 1726882186.30548: waiting for pending results... 
11579 1726882186.30788: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11579 1726882186.30919: in run() - task 12673a56-9f93-f197-7423-000000000035 11579 1726882186.31027: variable 'ansible_search_path' from source: unknown 11579 1726882186.31031: variable 'ansible_search_path' from source: unknown 11579 1726882186.31034: calling self._execute() 11579 1726882186.31089: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.31104: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.31120: variable 'omit' from source: magic vars 11579 1726882186.31533: variable 'ansible_distribution_major_version' from source: facts 11579 1726882186.31552: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882186.31679: variable 'network_provider' from source: set_fact 11579 1726882186.31697: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882186.31706: when evaluation is False, skipping this task 11579 1726882186.31713: _execute() done 11579 1726882186.31721: dumping result to json 11579 1726882186.31788: done dumping result, returning 11579 1726882186.31795: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-f197-7423-000000000035] 11579 1726882186.31798: sending task result for task 12673a56-9f93-f197-7423-000000000035 11579 1726882186.31864: done sending task result for task 12673a56-9f93-f197-7423-000000000035 11579 1726882186.31866: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11579 1726882186.31915: no more pending results, returning what we have 11579 1726882186.31919: results queue empty 11579 1726882186.31920: checking for 
any_errors_fatal 11579 1726882186.31926: done checking for any_errors_fatal 11579 1726882186.31927: checking for max_fail_percentage 11579 1726882186.31929: done checking for max_fail_percentage 11579 1726882186.31930: checking to see if all hosts have failed and the running result is not ok 11579 1726882186.31931: done checking to see if all hosts have failed 11579 1726882186.31932: getting the remaining hosts for this loop 11579 1726882186.31934: done getting the remaining hosts for this loop 11579 1726882186.31937: getting the next task for host managed_node1 11579 1726882186.31945: done getting next task for host managed_node1 11579 1726882186.31949: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11579 1726882186.31952: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882186.31968: getting variables 11579 1726882186.31970: in VariableManager get_vars() 11579 1726882186.32017: Calling all_inventory to load vars for managed_node1 11579 1726882186.32020: Calling groups_inventory to load vars for managed_node1 11579 1726882186.32023: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882186.32036: Calling all_plugins_play to load vars for managed_node1 11579 1726882186.32039: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882186.32042: Calling groups_plugins_play to load vars for managed_node1 11579 1726882186.33632: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882186.35253: done with get_vars() 11579 1726882186.35273: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:29:46 -0400 (0:00:00.052) 0:00:15.062 ****** 11579 1726882186.35361: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11579 1726882186.35363: Creating lock for fedora.linux_system_roles.network_connections 11579 1726882186.35778: worker is 1 (out of 1 available) 11579 1726882186.35790: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11579 1726882186.35803: done queuing things up, now waiting for results queue to drain 11579 1726882186.35804: waiting for pending results... 
11579 1726882186.36113: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11579 1726882186.36125: in run() - task 12673a56-9f93-f197-7423-000000000036 11579 1726882186.36146: variable 'ansible_search_path' from source: unknown 11579 1726882186.36153: variable 'ansible_search_path' from source: unknown 11579 1726882186.36195: calling self._execute() 11579 1726882186.36298: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.36302: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.36429: variable 'omit' from source: magic vars 11579 1726882186.36713: variable 'ansible_distribution_major_version' from source: facts 11579 1726882186.36731: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882186.36742: variable 'omit' from source: magic vars 11579 1726882186.36810: variable 'omit' from source: magic vars 11579 1726882186.36980: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882186.39108: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882186.39181: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882186.39226: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882186.39273: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882186.39306: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882186.39394: variable 'network_provider' from source: set_fact 11579 1726882186.39525: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882186.39891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882186.39931: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882186.39975: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882186.39998: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882186.40116: variable 'omit' from source: magic vars 11579 1726882186.40202: variable 'omit' from source: magic vars 11579 1726882186.40312: variable 'network_connections' from source: task vars 11579 1726882186.40332: variable 'controller_profile' from source: play vars 11579 1726882186.40400: variable 'controller_profile' from source: play vars 11579 1726882186.40445: variable 'controller_device' from source: play vars 11579 1726882186.40486: variable 'controller_device' from source: play vars 11579 1726882186.40503: variable 'port1_profile' from source: play vars 11579 1726882186.40571: variable 'port1_profile' from source: play vars 11579 1726882186.40584: variable 'dhcp_interface1' from source: play vars 11579 1726882186.40661: variable 'dhcp_interface1' from source: play vars 11579 1726882186.40664: variable 'controller_profile' from source: play vars 11579 1726882186.40723: variable 'controller_profile' from source: play vars 11579 1726882186.40770: 
variable 'port2_profile' from source: play vars 11579 1726882186.40805: variable 'port2_profile' from source: play vars 11579 1726882186.40818: variable 'dhcp_interface2' from source: play vars 11579 1726882186.40887: variable 'dhcp_interface2' from source: play vars 11579 1726882186.40901: variable 'controller_profile' from source: play vars 11579 1726882186.40962: variable 'controller_profile' from source: play vars 11579 1726882186.41200: variable 'omit' from source: magic vars 11579 1726882186.41205: variable '__lsr_ansible_managed' from source: task vars 11579 1726882186.41235: variable '__lsr_ansible_managed' from source: task vars 11579 1726882186.41418: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11579 1726882186.41644: Loaded config def from plugin (lookup/template) 11579 1726882186.41654: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11579 1726882186.41684: File lookup term: get_ansible_managed.j2 11579 1726882186.41752: variable 'ansible_search_path' from source: unknown 11579 1726882186.41755: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11579 1726882186.41760: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11579 1726882186.41763: variable 'ansible_search_path' from source: unknown 11579 1726882186.47618: variable 'ansible_managed' from source: unknown 11579 1726882186.47761: variable 'omit' from source: magic vars 11579 1726882186.47890: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882186.47895: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882186.47898: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882186.47900: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882186.47902: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882186.47919: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882186.47928: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.47937: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.48044: Set connection var ansible_timeout to 10 11579 1726882186.48057: Set connection var ansible_shell_type to sh 11579 1726882186.48070: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882186.48079: Set connection var ansible_shell_executable to /bin/sh 11579 1726882186.48122: Set connection var ansible_pipelining to False 11579 1726882186.48129: Set connection var ansible_connection to ssh 11579 1726882186.48147: 
variable 'ansible_shell_executable' from source: unknown 11579 1726882186.48156: variable 'ansible_connection' from source: unknown 11579 1726882186.48217: variable 'ansible_module_compression' from source: unknown 11579 1726882186.48220: variable 'ansible_shell_type' from source: unknown 11579 1726882186.48222: variable 'ansible_shell_executable' from source: unknown 11579 1726882186.48224: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882186.48226: variable 'ansible_pipelining' from source: unknown 11579 1726882186.48228: variable 'ansible_timeout' from source: unknown 11579 1726882186.48230: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882186.48344: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882186.48363: variable 'omit' from source: magic vars 11579 1726882186.48376: starting attempt loop 11579 1726882186.48382: running the handler 11579 1726882186.48403: _low_level_execute_command(): starting 11579 1726882186.48434: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882186.49199: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882186.49224: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.49240: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882186.49330: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.50985: stdout chunk (state=3): >>>/root <<< 11579 1726882186.51143: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882186.51148: stdout chunk (state=3): >>><<< 11579 1726882186.51151: stderr chunk (state=3): >>><<< 11579 1726882186.51267: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882186.51271: _low_level_execute_command(): starting 11579 1726882186.51274: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970 `" && echo ansible-tmp-1726882186.5117517-12293-76254214350970="` echo /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970 `" ) && sleep 0' 11579 1726882186.51843: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.52014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882186.52121: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.53974: stdout chunk (state=3): 
>>>ansible-tmp-1726882186.5117517-12293-76254214350970=/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970 <<< 11579 1726882186.54203: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882186.54207: stdout chunk (state=3): >>><<< 11579 1726882186.54214: stderr chunk (state=3): >>><<< 11579 1726882186.54239: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882186.5117517-12293-76254214350970=/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882186.54286: variable 'ansible_module_compression' from source: unknown 11579 1726882186.54336: ANSIBALLZ: Using lock for fedora.linux_system_roles.network_connections 11579 1726882186.54340: ANSIBALLZ: Acquiring lock 11579 1726882186.54342: ANSIBALLZ: Lock acquired: 
139873763327008 11579 1726882186.54348: ANSIBALLZ: Creating module 11579 1726882186.77630: ANSIBALLZ: Writing module into payload 11579 1726882186.77969: ANSIBALLZ: Writing module 11579 1726882186.78000: ANSIBALLZ: Renaming module 11579 1726882186.78198: ANSIBALLZ: Done creating module 11579 1726882186.78201: variable 'ansible_facts' from source: unknown 11579 1726882186.78204: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py 11579 1726882186.78338: Sending initial data 11579 1726882186.78348: Sent initial data (167 bytes) 11579 1726882186.78992: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882186.79011: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882186.79099: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882186.79137: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882186.79154: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.79167: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master 
version 4 <<< 11579 1726882186.79247: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.80869: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882186.80937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882186.81010: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp1gcoqb7q /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py <<< 11579 1726882186.81026: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py" <<< 11579 1726882186.81056: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp1gcoqb7q" to remote "/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py" <<< 11579 1726882186.82251: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882186.82263: stderr chunk (state=3): >>><<< 11579 1726882186.82384: stdout chunk (state=3): >>><<< 11579 1726882186.82388: done transferring module to remote 11579 1726882186.82390: _low_level_execute_command(): starting 11579 1726882186.82395: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/ /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py && sleep 0' 11579 1726882186.83159: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882186.83204: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882186.83222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.83245: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882186.83338: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882186.85109: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882186.85127: stdout chunk (state=3): >>><<< 11579 1726882186.85143: stderr chunk (state=3): >>><<< 11579 1726882186.85199: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882186.85202: _low_level_execute_command(): starting 11579 1726882186.85205: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/AnsiballZ_network_connections.py && sleep 0' 11579 1726882186.86201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882186.86210: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882186.86276: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.27897: stdout chunk (state=3): >>> {"changed": true, "warnings": [], 
"stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11579 1726882187.30004: stderr chunk (state=3): >>>debug2: Received exit status from 
master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882187.30008: stdout chunk (state=3): >>><<< 11579 1726882187.30010: stderr chunk (state=3): >>><<< 11579 1726882187.30180: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "warnings": [], "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7\n[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", "interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0", "state": "up", "type": "bond", "interface_name": "nm-bond", "bond": {"mode": "active-backup", "miimon": 110}, "ip": {"route_metric4": 65535}}, {"name": "bond0.0", "state": "up", "type": "ethernet", "interface_name": "test1", "controller": "bond0"}, {"name": "bond0.1", "state": "up", "type": "ethernet", 
"interface_name": "test2", "controller": "bond0"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882187.30184: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0', 'state': 'up', 'type': 'bond', 'interface_name': 'nm-bond', 'bond': {'mode': 'active-backup', 'miimon': 110}, 'ip': {'route_metric4': 65535}}, {'name': 'bond0.0', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test1', 'controller': 'bond0'}, {'name': 'bond0.1', 'state': 'up', 'type': 'ethernet', 'interface_name': 'test2', 'controller': 'bond0'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882187.30187: _low_level_execute_command(): starting 11579 1726882187.30189: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882186.5117517-12293-76254214350970/ > /dev/null 2>&1 && sleep 0' 11579 1726882187.30807: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.30891: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882187.30913: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882187.30933: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.31002: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.32911: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.32922: stdout chunk (state=3): >>><<< 11579 1726882187.32936: stderr chunk (state=3): >>><<< 11579 1726882187.32964: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11579 1726882187.33098: handler run complete
11579 1726882187.33101: attempt loop complete, returning result
11579 1726882187.33104: _execute() done
11579 1726882187.33106: dumping result to json
11579 1726882187.33108: done dumping result, returning
11579 1726882187.33110: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-f197-7423-000000000036]
11579 1726882187.33112: sending task result for task 12673a56-9f93-f197-7423-000000000036
changed: [managed_node1] => {
    "_invocation": {
        "module_args": {
            "__debug_flags": "",
            "__header": "#\n# Ansible managed\n#\n# system_role:network\n",
            "connections": [
                {
                    "bond": {
                        "miimon": 110,
                        "mode": "active-backup"
                    },
                    "interface_name": "nm-bond",
                    "ip": {
                        "route_metric4": 65535
                    },
                    "name": "bond0",
                    "state": "up",
                    "type": "bond"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test1",
                    "name": "bond0.0",
                    "state": "up",
                    "type": "ethernet"
                },
                {
                    "controller": "bond0",
                    "interface_name": "test2",
                    "name": "bond0.1",
                    "state": "up",
                    "type": "ethernet"
                }
            ],
            "force_state_change": false,
            "ignore_errors": false,
            "provider": "nm"
        }
    },
    "changed": true
}

STDERR:

[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe
[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7
[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb
[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)
[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)
[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)

11579 1726882187.33533: no more pending results, returning what we have
11579 1726882187.33537: results queue empty
11579 1726882187.33538: checking for any_errors_fatal
11579 1726882187.33544: done checking for any_errors_fatal
11579 1726882187.33545: checking for max_fail_percentage
11579 1726882187.33546: done checking for max_fail_percentage
11579 1726882187.33547: checking to see if all hosts have failed and the running result is not ok
11579 1726882187.33548: done checking to see if all hosts have failed
11579 1726882187.33549: getting the remaining hosts for this loop
11579 1726882187.33551: done getting the remaining hosts for this loop
11579 1726882187.33554: getting the next task for host managed_node1
11579 1726882187.33560: done getting next task for host managed_node1
11579 1726882187.33564: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state
11579 1726882187.33566: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task?
False 11579 1726882187.33577: getting variables 11579 1726882187.33578: in VariableManager get_vars() 11579 1726882187.33632: Calling all_inventory to load vars for managed_node1 11579 1726882187.33635: Calling groups_inventory to load vars for managed_node1 11579 1726882187.33638: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.33644: done sending task result for task 12673a56-9f93-f197-7423-000000000036 11579 1726882187.33647: WORKER PROCESS EXITING 11579 1726882187.33657: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.33660: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.33663: Calling groups_plugins_play to load vars for managed_node1 11579 1726882187.35543: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882187.37203: done with get_vars() 11579 1726882187.37227: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:29:47 -0400 (0:00:01.019) 0:00:16.081 ****** 11579 1726882187.37322: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11579 1726882187.37324: Creating lock for fedora.linux_system_roles.network_state 11579 1726882187.37735: worker is 1 (out of 1 available) 11579 1726882187.37747: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11579 1726882187.37757: done queuing things up, now waiting for results queue to drain 11579 1726882187.37758: waiting for pending results... 
11579 1726882187.37988: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11579 1726882187.38130: in run() - task 12673a56-9f93-f197-7423-000000000037 11579 1726882187.38153: variable 'ansible_search_path' from source: unknown 11579 1726882187.38161: variable 'ansible_search_path' from source: unknown 11579 1726882187.38205: calling self._execute() 11579 1726882187.38320: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.38324: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.38354: variable 'omit' from source: magic vars 11579 1726882187.38727: variable 'ansible_distribution_major_version' from source: facts 11579 1726882187.38745: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882187.38898: variable 'network_state' from source: role '' defaults 11579 1726882187.38901: Evaluated conditional (network_state != {}): False 11579 1726882187.38907: when evaluation is False, skipping this task 11579 1726882187.39002: _execute() done 11579 1726882187.39005: dumping result to json 11579 1726882187.39008: done dumping result, returning 11579 1726882187.39010: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-f197-7423-000000000037] 11579 1726882187.39012: sending task result for task 12673a56-9f93-f197-7423-000000000037 11579 1726882187.39080: done sending task result for task 12673a56-9f93-f197-7423-000000000037 11579 1726882187.39083: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882187.39143: no more pending results, returning what we have 11579 1726882187.39147: results queue empty 11579 1726882187.39148: checking for any_errors_fatal 11579 1726882187.39162: done checking for any_errors_fatal 
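The skip above follows from the role default `network_state: {}` (the log notes the variable comes from "role '' defaults"): the task's conditional `network_state != {}` is False for an empty dict, so only the `network_connections` path runs in this test. A minimal sketch of that evaluation, with the default value assumed from the log:

```python
# Role default, as implied by "variable 'network_state' from source: role '' defaults".
network_state = {}

# The task's conditional exactly as printed: (network_state != {})
run_task = network_state != {}
print(run_task)  # False -> "when evaluation is False, skipping this task"
```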
11579 1726882187.39163: checking for max_fail_percentage 11579 1726882187.39164: done checking for max_fail_percentage 11579 1726882187.39165: checking to see if all hosts have failed and the running result is not ok 11579 1726882187.39167: done checking to see if all hosts have failed 11579 1726882187.39167: getting the remaining hosts for this loop 11579 1726882187.39169: done getting the remaining hosts for this loop 11579 1726882187.39173: getting the next task for host managed_node1 11579 1726882187.39180: done getting next task for host managed_node1 11579 1726882187.39183: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11579 1726882187.39187: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882187.39204: getting variables 11579 1726882187.39206: in VariableManager get_vars() 11579 1726882187.39249: Calling all_inventory to load vars for managed_node1 11579 1726882187.39252: Calling groups_inventory to load vars for managed_node1 11579 1726882187.39254: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.39266: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.39269: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.39271: Calling groups_plugins_play to load vars for managed_node1 11579 1726882187.40519: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882187.41371: done with get_vars() 11579 1726882187.41385: done getting variables 11579 1726882187.41427: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:29:47 -0400 (0:00:00.041) 0:00:16.122 ****** 11579 1726882187.41450: entering _queue_task() for managed_node1/debug 11579 1726882187.41651: worker is 1 (out of 1 available) 11579 1726882187.41666: exiting _queue_task() for managed_node1/debug 11579 1726882187.41676: done queuing things up, now waiting for results queue to drain 11579 1726882187.41677: waiting for pending results... 
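The task being queued here ("Show stderr messages for the network_connections", task path main.yml:177) is a plain `debug` action over the `__network_connections_result` fact set earlier in the run. A minimal reconstruction of what such a task looks like (a sketch inferred from the log, not the role's exact source):

```yaml
# Sketch of a debug task printing the stderr lines captured in the
# __network_connections_result fact shown in the result below.
- name: Show stderr messages for the network_connections
  ansible.builtin.debug:
    var: __network_connections_result.stderr_lines
```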
11579 1726882187.41862: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11579 1726882187.41974: in run() - task 12673a56-9f93-f197-7423-000000000038 11579 1726882187.42014: variable 'ansible_search_path' from source: unknown 11579 1726882187.42018: variable 'ansible_search_path' from source: unknown 11579 1726882187.42035: calling self._execute() 11579 1726882187.42132: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.42136: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.42298: variable 'omit' from source: magic vars 11579 1726882187.42496: variable 'ansible_distribution_major_version' from source: facts 11579 1726882187.42510: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882187.42515: variable 'omit' from source: magic vars 11579 1726882187.42572: variable 'omit' from source: magic vars 11579 1726882187.42613: variable 'omit' from source: magic vars 11579 1726882187.42648: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882187.42747: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882187.42750: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882187.42753: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.42755: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.42764: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882187.42767: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.42771: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11579 1726882187.42967: Set connection var ansible_timeout to 10 11579 1726882187.42970: Set connection var ansible_shell_type to sh 11579 1726882187.42973: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882187.42975: Set connection var ansible_shell_executable to /bin/sh 11579 1726882187.42978: Set connection var ansible_pipelining to False 11579 1726882187.42980: Set connection var ansible_connection to ssh 11579 1726882187.42982: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.42984: variable 'ansible_connection' from source: unknown 11579 1726882187.42986: variable 'ansible_module_compression' from source: unknown 11579 1726882187.42988: variable 'ansible_shell_type' from source: unknown 11579 1726882187.42991: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.42995: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.42997: variable 'ansible_pipelining' from source: unknown 11579 1726882187.42999: variable 'ansible_timeout' from source: unknown 11579 1726882187.43001: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.43079: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882187.43090: variable 'omit' from source: magic vars 11579 1726882187.43096: starting attempt loop 11579 1726882187.43102: running the handler 11579 1726882187.43238: variable '__network_connections_result' from source: set_fact 11579 1726882187.43304: handler run complete 11579 1726882187.43321: attempt loop complete, returning result 11579 1726882187.43324: _execute() done 11579 1726882187.43326: dumping result to json 11579 1726882187.43329: 
done dumping result, returning 11579 1726882187.43387: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-f197-7423-000000000038] 11579 1726882187.43391: sending task result for task 12673a56-9f93-f197-7423-000000000038 11579 1726882187.43461: done sending task result for task 12673a56-9f93-f197-7423-000000000038 11579 1726882187.43464: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)" ] } 11579 1726882187.43556: no more pending results, returning what we have 11579 1726882187.43558: results queue empty 11579 1726882187.43559: checking for any_errors_fatal 11579 1726882187.43562: done checking for any_errors_fatal 11579 1726882187.43563: checking for max_fail_percentage 11579 1726882187.43565: done checking for max_fail_percentage 11579 1726882187.43565: checking to see if all hosts have failed and the running result is not ok 11579 1726882187.43566: done checking to see if all hosts have failed 11579 1726882187.43567: getting the remaining hosts for this loop 11579 1726882187.43568: done getting the remaining hosts for this loop 11579 1726882187.43571: getting the next task for host 
managed_node1 11579 1726882187.43575: done getting next task for host managed_node1 11579 1726882187.43578: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11579 1726882187.43580: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882187.43589: getting variables 11579 1726882187.43590: in VariableManager get_vars() 11579 1726882187.43626: Calling all_inventory to load vars for managed_node1 11579 1726882187.43629: Calling groups_inventory to load vars for managed_node1 11579 1726882187.43631: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.43638: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.43641: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.43643: Calling groups_plugins_play to load vars for managed_node1 11579 1726882187.44929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882187.45772: done with get_vars() 11579 1726882187.45785: done getting variables 11579 1726882187.45831: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show 
debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:29:47 -0400 (0:00:00.044) 0:00:16.167 ****** 11579 1726882187.45856: entering _queue_task() for managed_node1/debug 11579 1726882187.46063: worker is 1 (out of 1 available) 11579 1726882187.46075: exiting _queue_task() for managed_node1/debug 11579 1726882187.46089: done queuing things up, now waiting for results queue to drain 11579 1726882187.46091: waiting for pending results... 11579 1726882187.46253: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11579 1726882187.46333: in run() - task 12673a56-9f93-f197-7423-000000000039 11579 1726882187.46345: variable 'ansible_search_path' from source: unknown 11579 1726882187.46349: variable 'ansible_search_path' from source: unknown 11579 1726882187.46376: calling self._execute() 11579 1726882187.46449: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.46454: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.46463: variable 'omit' from source: magic vars 11579 1726882187.47001: variable 'ansible_distribution_major_version' from source: facts 11579 1726882187.47004: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882187.47007: variable 'omit' from source: magic vars 11579 1726882187.47008: variable 'omit' from source: magic vars 11579 1726882187.47011: variable 'omit' from source: magic vars 11579 1726882187.47013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882187.47027: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882187.47050: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 
11579 1726882187.47069: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.47084: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.47120: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882187.47139: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.47146: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.47258: Set connection var ansible_timeout to 10 11579 1726882187.47271: Set connection var ansible_shell_type to sh 11579 1726882187.47284: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882187.47299: Set connection var ansible_shell_executable to /bin/sh 11579 1726882187.47313: Set connection var ansible_pipelining to False 11579 1726882187.47320: Set connection var ansible_connection to ssh 11579 1726882187.47353: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.47361: variable 'ansible_connection' from source: unknown 11579 1726882187.47368: variable 'ansible_module_compression' from source: unknown 11579 1726882187.47460: variable 'ansible_shell_type' from source: unknown 11579 1726882187.47463: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.47465: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.47467: variable 'ansible_pipelining' from source: unknown 11579 1726882187.47469: variable 'ansible_timeout' from source: unknown 11579 1726882187.47471: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.47555: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882187.47574: variable 'omit' from source: magic vars 11579 1726882187.47577: starting attempt loop 11579 1726882187.47580: running the handler 11579 1726882187.47620: variable '__network_connections_result' from source: set_fact 11579 1726882187.47676: variable '__network_connections_result' from source: set_fact 11579 1726882187.47795: handler run complete 11579 1726882187.47809: attempt loop complete, returning result 11579 1726882187.47812: _execute() done 11579 1726882187.47814: dumping result to json 11579 1726882187.47820: done dumping result, returning 11579 1726882187.47828: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-f197-7423-000000000039] 11579 1726882187.47832: sending task result for task 12673a56-9f93-f197-7423-000000000039 ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "bond": { "miimon": 110, "mode": "active-backup" }, "interface_name": "nm-bond", "ip": { "route_metric4": 65535 }, "name": "bond0", "state": "up", "type": "bond" }, { "controller": "bond0", "interface_name": "test1", "name": "bond0.0", "state": "up", "type": "ethernet" }, { "controller": "bond0", "interface_name": "test2", "name": "bond0.1", "state": "up", "type": "ethernet" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe\n[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7\n[009] #2, state:up 
persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb\n[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)\n[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)\n[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)\n", "stderr_lines": [ "[007] #0, state:up persistent_state:present, 'bond0': add connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe", "[008] #1, state:up persistent_state:present, 'bond0.0': add connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7", "[009] #2, state:up persistent_state:present, 'bond0.1': add connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb", "[010] #0, state:up persistent_state:present, 'bond0': up connection bond0, 93b53e62-64f5-4c39-966c-031f30f8befe (is-modified)", "[011] #1, state:up persistent_state:present, 'bond0.0': up connection bond0.0, f55bc258-3187-4e12-b27d-0cad9097ebf7 (not-active)", "[012] #2, state:up persistent_state:present, 'bond0.1': up connection bond0.1, 29d61a64-4b27-4cf7-b22f-65c039402cbb (not-active)" ] } } 11579 1726882187.48024: no more pending results, returning what we have 11579 1726882187.48026: results queue empty 11579 1726882187.48027: checking for any_errors_fatal 11579 1726882187.48032: done checking for any_errors_fatal 11579 1726882187.48032: checking for max_fail_percentage 11579 1726882187.48038: done checking for max_fail_percentage 11579 1726882187.48039: checking to see if all hosts have failed and the running result is not ok 11579 1726882187.48040: done checking to see if all hosts have failed 11579 1726882187.48041: getting the remaining hosts for this loop 11579 1726882187.48042: done getting the remaining hosts for this loop 11579 1726882187.48045: getting the next task for host managed_node1 11579 
1726882187.48050: done getting next task for host managed_node1 11579 1726882187.48053: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11579 1726882187.48055: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882187.48064: getting variables 11579 1726882187.48066: in VariableManager get_vars() 11579 1726882187.48106: Calling all_inventory to load vars for managed_node1 11579 1726882187.48109: Calling groups_inventory to load vars for managed_node1 11579 1726882187.48111: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.48117: done sending task result for task 12673a56-9f93-f197-7423-000000000039 11579 1726882187.48119: WORKER PROCESS EXITING 11579 1726882187.48126: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.48128: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.48130: Calling groups_plugins_play to load vars for managed_node1 11579 1726882187.48849: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882187.49800: done with get_vars() 11579 1726882187.49814: done getting variables 11579 1726882187.49854: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:29:47 -0400 (0:00:00.040) 0:00:16.207 ****** 11579 1726882187.49875: entering _queue_task() for managed_node1/debug 11579 1726882187.50081: worker is 1 (out of 1 available) 11579 1726882187.50098: exiting _queue_task() for managed_node1/debug 11579 1726882187.50110: done queuing things up, now waiting for results queue to drain 11579 1726882187.50111: waiting for pending results... 11579 1726882187.50266: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11579 1726882187.50348: in run() - task 12673a56-9f93-f197-7423-00000000003a 11579 1726882187.50360: variable 'ansible_search_path' from source: unknown 11579 1726882187.50363: variable 'ansible_search_path' from source: unknown 11579 1726882187.50390: calling self._execute() 11579 1726882187.50457: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.50461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.50472: variable 'omit' from source: magic vars 11579 1726882187.50724: variable 'ansible_distribution_major_version' from source: facts 11579 1726882187.50733: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882187.50815: variable 'network_state' from source: role '' defaults 11579 1726882187.50823: Evaluated conditional (network_state != {}): False 11579 1726882187.50826: when evaluation is False, skipping this task 11579 1726882187.50829: _execute() done 11579 1726882187.50831: dumping result to json 11579 1726882187.50835: done 
dumping result, returning 11579 1726882187.50842: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-f197-7423-00000000003a] 11579 1726882187.50847: sending task result for task 12673a56-9f93-f197-7423-00000000003a 11579 1726882187.50931: done sending task result for task 12673a56-9f93-f197-7423-00000000003a 11579 1726882187.50934: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11579 1726882187.50976: no more pending results, returning what we have 11579 1726882187.50980: results queue empty 11579 1726882187.50980: checking for any_errors_fatal 11579 1726882187.50991: done checking for any_errors_fatal 11579 1726882187.50991: checking for max_fail_percentage 11579 1726882187.50997: done checking for max_fail_percentage 11579 1726882187.50998: checking to see if all hosts have failed and the running result is not ok 11579 1726882187.50999: done checking to see if all hosts have failed 11579 1726882187.50999: getting the remaining hosts for this loop 11579 1726882187.51001: done getting the remaining hosts for this loop 11579 1726882187.51004: getting the next task for host managed_node1 11579 1726882187.51010: done getting next task for host managed_node1 11579 1726882187.51013: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11579 1726882187.51016: ^ state is: HOST STATE: block=2, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11579 1726882187.51027: getting variables 11579 1726882187.51028: in VariableManager get_vars() 11579 1726882187.51060: Calling all_inventory to load vars for managed_node1 11579 1726882187.51062: Calling groups_inventory to load vars for managed_node1 11579 1726882187.51064: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.51072: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.51074: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.51077: Calling groups_plugins_play to load vars for managed_node1 11579 1726882187.51805: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882187.52665: done with get_vars() 11579 1726882187.52680: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:29:47 -0400 (0:00:00.028) 0:00:16.235 ****** 11579 1726882187.52747: entering _queue_task() for managed_node1/ping 11579 1726882187.52748: Creating lock for ping 11579 1726882187.52951: worker is 1 (out of 1 available) 11579 1726882187.52964: exiting _queue_task() for managed_node1/ping 11579 1726882187.52976: done queuing things up, now waiting for results queue to drain 11579 1726882187.52977: waiting for pending results... 
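For reference, the `module_args.connections` structure printed in the debug result above maps back to role input of roughly this shape: an active-backup bond with `miimon: 110` and two ethernet ports attached via `controller`. This is a reconstruction from the logged arguments, not the test's actual source file:

```yaml
# Reconstructed from the module_args in the log above (values as logged).
network_connections:
  - name: bond0
    type: bond
    interface_name: nm-bond
    state: up
    bond:
      mode: active-backup
      miimon: 110
    ip:
      route_metric4: 65535
  - name: bond0.0
    type: ethernet
    interface_name: test1
    controller: bond0
    state: up
  - name: bond0.1
    type: ethernet
    interface_name: test2
    controller: bond0
    state: up
```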
11579 1726882187.53137: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11579 1726882187.53253: in run() - task 12673a56-9f93-f197-7423-00000000003b 11579 1726882187.53265: variable 'ansible_search_path' from source: unknown 11579 1726882187.53268: variable 'ansible_search_path' from source: unknown 11579 1726882187.53301: calling self._execute() 11579 1726882187.53363: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.53369: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.53378: variable 'omit' from source: magic vars 11579 1726882187.53625: variable 'ansible_distribution_major_version' from source: facts 11579 1726882187.53635: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882187.53640: variable 'omit' from source: magic vars 11579 1726882187.53675: variable 'omit' from source: magic vars 11579 1726882187.53705: variable 'omit' from source: magic vars 11579 1726882187.53736: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882187.53762: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882187.53776: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882187.53798: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.53801: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882187.53825: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882187.53828: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.53830: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11579 1726882187.53898: Set connection var ansible_timeout to 10 11579 1726882187.53901: Set connection var ansible_shell_type to sh 11579 1726882187.53908: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882187.53915: Set connection var ansible_shell_executable to /bin/sh 11579 1726882187.53920: Set connection var ansible_pipelining to False 11579 1726882187.53922: Set connection var ansible_connection to ssh 11579 1726882187.53937: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.53940: variable 'ansible_connection' from source: unknown 11579 1726882187.53943: variable 'ansible_module_compression' from source: unknown 11579 1726882187.53945: variable 'ansible_shell_type' from source: unknown 11579 1726882187.53947: variable 'ansible_shell_executable' from source: unknown 11579 1726882187.53949: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882187.53954: variable 'ansible_pipelining' from source: unknown 11579 1726882187.53956: variable 'ansible_timeout' from source: unknown 11579 1726882187.53960: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882187.54100: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882187.54106: variable 'omit' from source: magic vars 11579 1726882187.54111: starting attempt loop 11579 1726882187.54114: running the handler 11579 1726882187.54126: _low_level_execute_command(): starting 11579 1726882187.54133: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882187.54641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 
1726882187.54645: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.54647: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.54649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.54701: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882187.54705: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882187.54714: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.54766: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.56419: stdout chunk (state=3): >>>/root <<< 11579 1726882187.56519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.56547: stderr chunk (state=3): >>><<< 11579 1726882187.56550: stdout chunk (state=3): >>><<< 11579 1726882187.56569: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882187.56579: _low_level_execute_command(): starting 11579 1726882187.56584: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006 `" && echo ansible-tmp-1726882187.5656757-12335-3327023434006="` echo /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006 `" ) && sleep 0' 11579 1726882187.56976: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.57015: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882187.57019: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.57027: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882187.57031: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.57067: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882187.57070: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.57122: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.59006: stdout chunk (state=3): >>>ansible-tmp-1726882187.5656757-12335-3327023434006=/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006 <<< 11579 1726882187.59152: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.59155: stdout chunk (state=3): >>><<< 11579 1726882187.59157: stderr chunk (state=3): >>><<< 11579 1726882187.59173: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882187.5656757-12335-3327023434006=/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
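The exchange above creates a remote working directory with a name like `ansible-tmp-1726882187.5656757-12335-3327023434006`. The three components appear to be a timestamp, a PID, and a random suffix; the sketch below composes a name in that style. This is a reconstruction for illustration, not Ansible's actual implementation.

```python
import os
import random
import time

def make_remote_tmp_name(prefix="ansible-tmp"):
    """Compose a collision-resistant temp-dir name in the style seen in
    the log above: <prefix>-<timestamp>-<pid>-<random>. Sketch only;
    the real field widths and RNG may differ."""
    return "%s-%s-%s-%s" % (prefix, time.time(), os.getpid(),
                            random.randint(0, 2 ** 48))

name = make_remote_tmp_name()
print(name)
```

The timestamp plus PID plus random suffix makes accidental collisions between concurrent workers on the same control node effectively impossible, which matters because the directory is created with `mkdir` (not `mkdir -p`) so a collision would fail the task.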
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882187.59403: variable 'ansible_module_compression' from source: unknown 11579 1726882187.59406: ANSIBALLZ: Using lock for ping 11579 1726882187.59408: ANSIBALLZ: Acquiring lock 11579 1726882187.59410: ANSIBALLZ: Lock acquired: 139873761206368 11579 1726882187.59412: ANSIBALLZ: Creating module 11579 1726882187.70926: ANSIBALLZ: Writing module into payload 11579 1726882187.71000: ANSIBALLZ: Writing module 11579 1726882187.71026: ANSIBALLZ: Renaming module 11579 1726882187.71037: ANSIBALLZ: Done creating module 11579 1726882187.71057: variable 'ansible_facts' from source: unknown 11579 1726882187.71143: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py 11579 1726882187.71327: Sending initial data 11579 1726882187.71336: Sent initial data (151 bytes) 11579 1726882187.71938: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882187.71953: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882187.71978: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.72014: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.72085: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.72134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882187.72155: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.72317: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.73899: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882187.73937: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882187.74059: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpy9q1aqqm /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py <<< 11579 1726882187.74062: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py" <<< 11579 1726882187.74099: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpy9q1aqqm" to remote "/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py" <<< 11579 1726882187.75342: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.75352: stdout chunk (state=3): >>><<< 11579 1726882187.75373: stderr chunk (state=3): >>><<< 11579 1726882187.75433: done transferring module to remote 11579 1726882187.75450: _low_level_execute_command(): starting 11579 1726882187.75460: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/ /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py && sleep 0' 11579 1726882187.76076: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882187.76142: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.76206: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882187.76220: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882187.76248: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.76320: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.78097: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.78101: stdout chunk (state=3): >>><<< 11579 1726882187.78103: stderr chunk (state=3): >>><<< 11579 1726882187.78202: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882187.78206: _low_level_execute_command(): starting 11579 1726882187.78208: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/AnsiballZ_ping.py && sleep 0' 11579 1726882187.78791: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882187.78811: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882187.78828: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.78860: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882187.78960: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882187.79011: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 
1726882187.79076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.93843: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11579 1726882187.95204: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882187.95208: stdout chunk (state=3): >>><<< 11579 1726882187.95411: stderr chunk (state=3): >>><<< 11579 1726882187.95415: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
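The `{"ping": "pong"}` payload above is the entire useful output of the `ping` module: it is a trivial round-trip check that the module pipeline (transfer, Python interpreter, JSON result) works end to end. A minimal sketch of its documented behavior:

```python
def ping_module(data="pong"):
    """Minimal sketch of Ansible's ping module contract: echo the
    'data' argument back under the 'ping' key. The real module also
    deliberately raises when data == 'crash', to let tests exercise
    the error path."""
    if data == "crash":
        raise Exception("boom")
    return {"ping": data}

result = ping_module()
print(result)
```

The module reports `changed: false` unconditionally, which is why the task above shows `ok: [managed_node1]` rather than `changed`.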
11579 1726882187.95418: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882187.95421: _low_level_execute_command(): starting 11579 1726882187.95423: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882187.5656757-12335-3327023434006/ > /dev/null 2>&1 && sleep 0' 11579 1726882187.96038: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.96043: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.96064: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882187.96067: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882187.96123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882187.96127: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882187.96178: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882187.98071: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882187.98074: stdout chunk (state=3): >>><<< 11579 1726882187.98076: stderr chunk (state=3): >>><<< 11579 1726882187.98301: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882187.98305: handler run complete 11579 
1726882187.98308: attempt loop complete, returning result 11579 1726882187.98310: _execute() done 11579 1726882187.98312: dumping result to json 11579 1726882187.98315: done dumping result, returning 11579 1726882187.98317: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-f197-7423-00000000003b] 11579 1726882187.98319: sending task result for task 12673a56-9f93-f197-7423-00000000003b 11579 1726882187.98387: done sending task result for task 12673a56-9f93-f197-7423-00000000003b 11579 1726882187.98390: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "ping": "pong" } 11579 1726882187.98468: no more pending results, returning what we have 11579 1726882187.98474: results queue empty 11579 1726882187.98475: checking for any_errors_fatal 11579 1726882187.98482: done checking for any_errors_fatal 11579 1726882187.98482: checking for max_fail_percentage 11579 1726882187.98484: done checking for max_fail_percentage 11579 1726882187.98485: checking to see if all hosts have failed and the running result is not ok 11579 1726882187.98486: done checking to see if all hosts have failed 11579 1726882187.98486: getting the remaining hosts for this loop 11579 1726882187.98488: done getting the remaining hosts for this loop 11579 1726882187.98491: getting the next task for host managed_node1 11579 1726882187.98503: done getting next task for host managed_node1 11579 1726882187.98505: ^ task is: TASK: meta (role_complete) 11579 1726882187.98507: ^ state is: HOST STATE: block=2, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882187.98632: getting variables 11579 1726882187.98634: in VariableManager get_vars() 11579 1726882187.98673: Calling all_inventory to load vars for managed_node1 11579 1726882187.98676: Calling groups_inventory to load vars for managed_node1 11579 1726882187.98678: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882187.98689: Calling all_plugins_play to load vars for managed_node1 11579 1726882187.98692: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882187.98812: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.00788: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.03227: done with get_vars() 11579 1726882188.03250: done getting variables 11579 1726882188.03340: done queuing things up, now waiting for results queue to drain 11579 1726882188.03342: results queue empty 11579 1726882188.03343: checking for any_errors_fatal 11579 1726882188.03345: done checking for any_errors_fatal 11579 1726882188.03346: checking for max_fail_percentage 11579 1726882188.03347: done checking for max_fail_percentage 11579 1726882188.03348: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.03349: done checking to see if all hosts have failed 11579 1726882188.03349: getting the remaining hosts for this loop 11579 1726882188.03350: done getting the remaining hosts for this loop 11579 1726882188.03353: getting the next task for host managed_node1 11579 1726882188.03357: done getting next task for host managed_node1 11579 1726882188.03364: ^ task is: TASK: Include the task 'get_interface_stat.yml' 11579 1726882188.03366: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882188.03368: getting variables 11579 1726882188.03369: in VariableManager get_vars() 11579 1726882188.03382: Calling all_inventory to load vars for managed_node1 11579 1726882188.03384: Calling groups_inventory to load vars for managed_node1 11579 1726882188.03386: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.03391: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.03395: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.03398: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.04629: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.06325: done with get_vars() 11579 1726882188.06349: done getting variables TASK [Include the task 'get_interface_stat.yml'] ******************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:3 Friday 20 September 2024 21:29:48 -0400 (0:00:00.536) 0:00:16.772 ****** 11579 1726882188.06443: entering _queue_task() for managed_node1/include_tasks 11579 1726882188.06817: worker is 1 (out of 1 available) 11579 1726882188.06836: exiting _queue_task() for managed_node1/include_tasks 11579 1726882188.06848: done queuing things up, now waiting for results queue to drain 11579 1726882188.06849: waiting for pending results... 
11579 1726882188.07141: running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' 11579 1726882188.07257: in run() - task 12673a56-9f93-f197-7423-00000000006e 11579 1726882188.07280: variable 'ansible_search_path' from source: unknown 11579 1726882188.07284: variable 'ansible_search_path' from source: unknown 11579 1726882188.07328: calling self._execute() 11579 1726882188.07419: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.07425: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.07434: variable 'omit' from source: magic vars 11579 1726882188.07824: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.07899: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.07902: _execute() done 11579 1726882188.07904: dumping result to json 11579 1726882188.07906: done dumping result, returning 11579 1726882188.07908: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_interface_stat.yml' [12673a56-9f93-f197-7423-00000000006e] 11579 1726882188.07909: sending task result for task 12673a56-9f93-f197-7423-00000000006e 11579 1726882188.07974: done sending task result for task 12673a56-9f93-f197-7423-00000000006e 11579 1726882188.07978: WORKER PROCESS EXITING 11579 1726882188.08007: no more pending results, returning what we have 11579 1726882188.08013: in VariableManager get_vars() 11579 1726882188.08175: Calling all_inventory to load vars for managed_node1 11579 1726882188.08179: Calling groups_inventory to load vars for managed_node1 11579 1726882188.08181: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.08192: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.08197: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.08200: Calling groups_plugins_play to load vars for managed_node1 11579 
1726882188.09737: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.11236: done with get_vars() 11579 1726882188.11257: variable 'ansible_search_path' from source: unknown 11579 1726882188.11258: variable 'ansible_search_path' from source: unknown 11579 1726882188.11302: we have included files to process 11579 1726882188.11304: generating all_blocks data 11579 1726882188.11305: done generating all_blocks data 11579 1726882188.11310: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882188.11311: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882188.11313: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml 11579 1726882188.11504: done processing included file 11579 1726882188.11506: iterating over new_blocks loaded from include file 11579 1726882188.11508: in VariableManager get_vars() 11579 1726882188.11527: done with get_vars() 11579 1726882188.11529: filtering new block on tags 11579 1726882188.11547: done filtering new block on tags 11579 1726882188.11549: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml for managed_node1 11579 1726882188.11554: extending task lists for all hosts with included blocks 11579 1726882188.11654: done extending task lists 11579 1726882188.11655: done processing included files 11579 1726882188.11656: results queue empty 11579 1726882188.11656: checking for any_errors_fatal 11579 1726882188.11658: done checking for any_errors_fatal 11579 1726882188.11659: checking for max_fail_percentage 11579 1726882188.11660: done checking for 
max_fail_percentage 11579 1726882188.11661: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.11661: done checking to see if all hosts have failed 11579 1726882188.11662: getting the remaining hosts for this loop 11579 1726882188.11663: done getting the remaining hosts for this loop 11579 1726882188.11666: getting the next task for host managed_node1 11579 1726882188.11670: done getting next task for host managed_node1 11579 1726882188.11672: ^ task is: TASK: Get stat for interface {{ interface }} 11579 1726882188.11675: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882188.11677: getting variables 11579 1726882188.11678: in VariableManager get_vars() 11579 1726882188.11691: Calling all_inventory to load vars for managed_node1 11579 1726882188.11698: Calling groups_inventory to load vars for managed_node1 11579 1726882188.11700: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.11705: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.11707: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.11710: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.12861: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.14355: done with get_vars() 11579 1726882188.14380: done getting variables 11579 1726882188.14559: variable 'interface' from source: task vars 11579 1726882188.14563: variable 'controller_device' from source: play vars 11579 1726882188.14629: variable 'controller_device' from source: play vars TASK [Get stat for interface nm-bond] ****************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_interface_stat.yml:3 Friday 20 September 2024 21:29:48 -0400 (0:00:00.082) 0:00:16.855 ****** 11579 1726882188.14663: entering _queue_task() for managed_node1/stat 11579 1726882188.15028: worker is 1 (out of 1 available) 11579 1726882188.15041: exiting _queue_task() for managed_node1/stat 11579 1726882188.15054: done queuing things up, now waiting for results queue to drain 11579 1726882188.15055: waiting for pending results... 
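The "Get stat for interface nm-bond" task that runs next uses the `stat` module against the interface's path so a later assert can verify the device exists. On Linux, a network interface is present exactly when its sysfs directory exists; a hedged sketch of that check (the `sysfs_root` path is the Linux convention assumed here, not taken from this log):

```python
import os

def interface_present(name, sysfs_root="/sys/class/net"):
    """Sketch of the check behind a 'Get stat for interface' task:
    a network interface exists iff its sysfs directory does.
    Assumes a Linux-style sysfs layout."""
    return os.path.isdir(os.path.join(sysfs_root, name))

print(interface_present("definitely-not-a-nic"))  # → False
```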
11579 1726882188.15423: running TaskExecutor() for managed_node1/TASK: Get stat for interface nm-bond 11579 1726882188.15503: in run() - task 12673a56-9f93-f197-7423-000000000241 11579 1726882188.15530: variable 'ansible_search_path' from source: unknown 11579 1726882188.15539: variable 'ansible_search_path' from source: unknown 11579 1726882188.15579: calling self._execute() 11579 1726882188.15675: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.15736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.15740: variable 'omit' from source: magic vars 11579 1726882188.16072: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.16088: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.16103: variable 'omit' from source: magic vars 11579 1726882188.16156: variable 'omit' from source: magic vars 11579 1726882188.16256: variable 'interface' from source: task vars 11579 1726882188.16266: variable 'controller_device' from source: play vars 11579 1726882188.16340: variable 'controller_device' from source: play vars 11579 1726882188.16388: variable 'omit' from source: magic vars 11579 1726882188.16412: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882188.16450: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882188.16474: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882188.16699: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.16703: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.16705: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882188.16708: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.16710: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.16711: Set connection var ansible_timeout to 10 11579 1726882188.16713: Set connection var ansible_shell_type to sh 11579 1726882188.16715: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882188.16717: Set connection var ansible_shell_executable to /bin/sh 11579 1726882188.16719: Set connection var ansible_pipelining to False 11579 1726882188.16722: Set connection var ansible_connection to ssh 11579 1726882188.16724: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.16726: variable 'ansible_connection' from source: unknown 11579 1726882188.16727: variable 'ansible_module_compression' from source: unknown 11579 1726882188.16729: variable 'ansible_shell_type' from source: unknown 11579 1726882188.16731: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.16738: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.16748: variable 'ansible_pipelining' from source: unknown 11579 1726882188.16757: variable 'ansible_timeout' from source: unknown 11579 1726882188.16765: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.16966: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882188.16980: variable 'omit' from source: magic vars 11579 1726882188.16990: starting attempt loop 11579 1726882188.17000: running the handler 11579 1726882188.17018: _low_level_execute_command(): starting 11579 1726882188.17030: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 
1726882188.17812: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.17875: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882188.17897: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.17924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.18000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.19604: stdout chunk (state=3): >>>/root <<< 11579 1726882188.19745: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882188.19757: stdout chunk (state=3): >>><<< 11579 1726882188.19772: stderr chunk (state=3): >>><<< 11579 1726882188.19807: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 
'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882188.19827: _low_level_execute_command(): starting 11579 1726882188.19910: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954 `" && echo ansible-tmp-1726882188.198144-12364-200744704567954="` echo /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954 `" ) && sleep 0' 11579 1726882188.20511: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882188.20518: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882188.20529: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882188.20544: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882188.20556: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882188.20566: stderr chunk (state=3): >>>debug2: match 
not found <<< 11579 1726882188.20572: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.20587: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882188.20602: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882188.20607: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882188.20667: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882188.20670: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882188.20672: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882188.20674: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882188.20676: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882188.20683: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.20749: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882188.20752: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.20777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.20845: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.22700: stdout chunk (state=3): >>>ansible-tmp-1726882188.198144-12364-200744704567954=/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954 <<< 11579 1726882188.22863: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882188.22866: stdout chunk (state=3): >>><<< 11579 1726882188.22869: stderr chunk (state=3): >>><<< 11579 1726882188.22885: 
_low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882188.198144-12364-200744704567954=/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882188.23104: variable 'ansible_module_compression' from source: unknown 11579 1726882188.23107: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882188.23110: variable 'ansible_facts' from source: unknown 11579 1726882188.23139: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py 11579 1726882188.23324: Sending initial data 11579 1726882188.23333: Sent initial data (152 bytes) 11579 1726882188.24218: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 
1726882188.24234: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882188.24249: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882188.24315: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.24376: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882188.24398: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.24428: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.24491: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.26024: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension 
"limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882188.26088: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882188.26201: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpkiapu3fj /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py <<< 11579 1726882188.26207: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py" <<< 11579 1726882188.26232: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpkiapu3fj" to remote "/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py" <<< 11579 1726882188.27020: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882188.27055: stderr chunk (state=3): >>><<< 11579 1726882188.27073: stdout chunk (state=3): >>><<< 11579 1726882188.27138: done transferring module to remote 11579 1726882188.27232: _low_level_execute_command(): starting 11579 1726882188.27236: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/ /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py && sleep 0' 11579 1726882188.27826: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882188.27839: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 
1726882188.27920: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.27971: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882188.27990: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.28022: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.28088: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.30013: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882188.30016: stdout chunk (state=3): >>><<< 11579 1726882188.30018: stderr chunk (state=3): >>><<< 11579 1726882188.30022: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882188.30024: _low_level_execute_command(): starting 11579 1726882188.30026: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/AnsiballZ_stat.py && sleep 0' 11579 1726882188.30533: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882188.30542: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882188.30563: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882188.30577: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882188.30590: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882188.30602: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882188.30613: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.30627: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882188.30635: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 
10.31.9.159 is address <<< 11579 1726882188.30642: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882188.30669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.30741: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.30779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.30841: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.45655: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27280, "dev": 23, "nlink": 1, "atime": 1726882187.1451669, "mtime": 1726882187.1451669, "ctime": 1726882187.1451669, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882188.46882: stderr chunk (state=3): >>>debug2: Received exit status 
from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882188.46886: stdout chunk (state=3): >>><<< 11579 1726882188.46889: stderr chunk (state=3): >>><<< 11579 1726882188.46912: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": true, "path": "/sys/class/net/nm-bond", "mode": "0777", "isdir": false, "ischr": false, "isblk": false, "isreg": false, "isfifo": false, "islnk": true, "issock": false, "uid": 0, "gid": 0, "size": 0, "inode": 27280, "dev": 23, "nlink": 1, "atime": 1726882187.1451669, "mtime": 1726882187.1451669, "ctime": 1726882187.1451669, "wusr": true, "rusr": true, "xusr": true, "wgrp": true, "rgrp": true, "xgrp": true, "woth": true, "roth": true, "xoth": true, "isuid": false, "isgid": false, "blocks": 0, "block_size": 4096, "device_type": 0, "readable": true, "writeable": true, "executable": true, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "pw_name": "root", "gr_name": "root"}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/sys/class/net/nm-bond", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882188.47040: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/sys/class/net/nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882188.47048: _low_level_execute_command(): starting 11579 1726882188.47051: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882188.198144-12364-200744704567954/ > /dev/null 2>&1 && sleep 0' 11579 1726882188.47618: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882188.47635: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882188.47649: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882188.47723: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882188.47773: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882188.47789: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882188.47825: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882188.47906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882188.49987: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882188.49990: stdout chunk (state=3): >>><<< 11579 1726882188.49992: stderr chunk (state=3): >>><<< 11579 1726882188.49996: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882188.49998: handler run complete 11579 1726882188.50004: attempt loop complete, returning result 11579 1726882188.50006: _execute() done 11579 1726882188.50008: dumping result to json 11579 1726882188.50010: done dumping result, returning 11579 1726882188.50012: done running TaskExecutor() for managed_node1/TASK: Get stat for interface nm-bond [12673a56-9f93-f197-7423-000000000241] 11579 1726882188.50014: sending task result for task 12673a56-9f93-f197-7423-000000000241 11579 1726882188.50092: done sending task result for task 12673a56-9f93-f197-7423-000000000241 11579 1726882188.50098: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "atime": 1726882187.1451669, "block_size": 4096, "blocks": 0, "ctime": 1726882187.1451669, "dev": 23, "device_type": 0, "executable": true, "exists": true, "gid": 0, "gr_name": "root", "inode": 27280, "isblk": false, "ischr": false, "isdir": false, "isfifo": false, "isgid": false, "islnk": true, "isreg": false, "issock": false, "isuid": false, "lnk_source": "/sys/devices/virtual/net/nm-bond", "lnk_target": "../../devices/virtual/net/nm-bond", "mode": "0777", "mtime": 1726882187.1451669, "nlink": 1, "path": "/sys/class/net/nm-bond", "pw_name": "root", "readable": true, "rgrp": true, "roth": true, "rusr": true, "size": 0, "uid": 0, "wgrp": true, "woth": true, "writeable": true, "wusr": true, "xgrp": true, "xoth": true, "xusr": true } } 11579 1726882188.50284: no more pending 
results, returning what we have 11579 1726882188.50287: results queue empty 11579 1726882188.50288: checking for any_errors_fatal 11579 1726882188.50289: done checking for any_errors_fatal 11579 1726882188.50289: checking for max_fail_percentage 11579 1726882188.50291: done checking for max_fail_percentage 11579 1726882188.50291: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.50299: done checking to see if all hosts have failed 11579 1726882188.50299: getting the remaining hosts for this loop 11579 1726882188.50301: done getting the remaining hosts for this loop 11579 1726882188.50305: getting the next task for host managed_node1 11579 1726882188.50313: done getting next task for host managed_node1 11579 1726882188.50315: ^ task is: TASK: Assert that the interface is present - '{{ interface }}' 11579 1726882188.50318: ^ state is: HOST STATE: block=2, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882188.50322: getting variables 11579 1726882188.50324: in VariableManager get_vars() 11579 1726882188.50364: Calling all_inventory to load vars for managed_node1 11579 1726882188.50480: Calling groups_inventory to load vars for managed_node1 11579 1726882188.50483: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.50498: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.50501: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.50503: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.51888: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.53460: done with get_vars() 11579 1726882188.53481: done getting variables 11579 1726882188.53542: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882188.53659: variable 'interface' from source: task vars 11579 1726882188.53663: variable 'controller_device' from source: play vars 11579 1726882188.53722: variable 'controller_device' from source: play vars TASK [Assert that the interface is present - 'nm-bond'] ************************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_device_present.yml:5 Friday 20 September 2024 21:29:48 -0400 (0:00:00.390) 0:00:17.245 ****** 11579 1726882188.53754: entering _queue_task() for managed_node1/assert 11579 1726882188.54076: worker is 1 (out of 1 available) 11579 1726882188.54089: exiting _queue_task() for managed_node1/assert 11579 1726882188.54303: done queuing things up, now waiting for results queue to drain 11579 1726882188.54305: waiting for pending results... 
11579 1726882188.54391: running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' 11579 1726882188.54515: in run() - task 12673a56-9f93-f197-7423-00000000006f 11579 1726882188.54637: variable 'ansible_search_path' from source: unknown 11579 1726882188.54640: variable 'ansible_search_path' from source: unknown 11579 1726882188.54644: calling self._execute() 11579 1726882188.54683: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.54699: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.54715: variable 'omit' from source: magic vars 11579 1726882188.55067: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.55086: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.55182: variable 'omit' from source: magic vars 11579 1726882188.55186: variable 'omit' from source: magic vars 11579 1726882188.55254: variable 'interface' from source: task vars 11579 1726882188.55265: variable 'controller_device' from source: play vars 11579 1726882188.55335: variable 'controller_device' from source: play vars 11579 1726882188.55358: variable 'omit' from source: magic vars 11579 1726882188.55407: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882188.55446: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882188.55471: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882188.55495: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.55516: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.55549: variable 'inventory_hostname' from source: 
host vars for 'managed_node1' 11579 1726882188.55557: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.55614: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.55720: Set connection var ansible_timeout to 10 11579 1726882188.55899: Set connection var ansible_shell_type to sh 11579 1726882188.55902: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882188.55904: Set connection var ansible_shell_executable to /bin/sh 11579 1726882188.55906: Set connection var ansible_pipelining to False 11579 1726882188.55909: Set connection var ansible_connection to ssh 11579 1726882188.55911: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.55912: variable 'ansible_connection' from source: unknown 11579 1726882188.55914: variable 'ansible_module_compression' from source: unknown 11579 1726882188.55916: variable 'ansible_shell_type' from source: unknown 11579 1726882188.55918: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.55920: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.55922: variable 'ansible_pipelining' from source: unknown 11579 1726882188.55924: variable 'ansible_timeout' from source: unknown 11579 1726882188.55926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.55966: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882188.55983: variable 'omit' from source: magic vars 11579 1726882188.55995: starting attempt loop 11579 1726882188.56004: running the handler 11579 1726882188.56134: variable 'interface_stat' from source: set_fact 11579 1726882188.56162: Evaluated conditional 
(interface_stat.stat.exists): True 11579 1726882188.56172: handler run complete 11579 1726882188.56189: attempt loop complete, returning result 11579 1726882188.56199: _execute() done 11579 1726882188.56206: dumping result to json 11579 1726882188.56214: done dumping result, returning 11579 1726882188.56225: done running TaskExecutor() for managed_node1/TASK: Assert that the interface is present - 'nm-bond' [12673a56-9f93-f197-7423-00000000006f] 11579 1726882188.56234: sending task result for task 12673a56-9f93-f197-7423-00000000006f ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882188.56412: no more pending results, returning what we have 11579 1726882188.56416: results queue empty 11579 1726882188.56417: checking for any_errors_fatal 11579 1726882188.56426: done checking for any_errors_fatal 11579 1726882188.56427: checking for max_fail_percentage 11579 1726882188.56429: done checking for max_fail_percentage 11579 1726882188.56430: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.56432: done checking to see if all hosts have failed 11579 1726882188.56432: getting the remaining hosts for this loop 11579 1726882188.56434: done getting the remaining hosts for this loop 11579 1726882188.56437: getting the next task for host managed_node1 11579 1726882188.56446: done getting next task for host managed_node1 11579 1726882188.56449: ^ task is: TASK: Include the task 'assert_profile_present.yml' 11579 1726882188.56451: ^ state is: HOST STATE: block=2, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882188.56455: getting variables 11579 1726882188.56457: in VariableManager get_vars() 11579 1726882188.56499: Calling all_inventory to load vars for managed_node1 11579 1726882188.56502: Calling groups_inventory to load vars for managed_node1 11579 1726882188.56505: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.56517: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.56520: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.56523: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.57233: done sending task result for task 12673a56-9f93-f197-7423-00000000006f 11579 1726882188.57236: WORKER PROCESS EXITING 11579 1726882188.58115: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.60127: done with get_vars() 11579 1726882188.60147: done getting variables TASK [Include the task 'assert_profile_present.yml'] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:67 Friday 20 September 2024 21:29:48 -0400 (0:00:00.064) 0:00:17.310 ****** 11579 1726882188.60239: entering _queue_task() for managed_node1/include_tasks 11579 1726882188.60550: worker is 1 (out of 1 available) 11579 1726882188.60562: exiting _queue_task() for managed_node1/include_tasks 11579 1726882188.60574: done queuing things up, now waiting for results queue to drain 11579 1726882188.60575: waiting for pending results... 
11579 1726882188.60955: running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' 11579 1726882188.61066: in run() - task 12673a56-9f93-f197-7423-000000000070 11579 1726882188.61095: variable 'ansible_search_path' from source: unknown 11579 1726882188.61176: variable 'controller_profile' from source: play vars 11579 1726882188.61367: variable 'controller_profile' from source: play vars 11579 1726882188.61389: variable 'port1_profile' from source: play vars 11579 1726882188.61459: variable 'port1_profile' from source: play vars 11579 1726882188.61474: variable 'port2_profile' from source: play vars 11579 1726882188.61546: variable 'port2_profile' from source: play vars 11579 1726882188.61567: variable 'omit' from source: magic vars 11579 1726882188.61705: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.61721: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.61737: variable 'omit' from source: magic vars 11579 1726882188.61955: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.61969: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.62027: variable 'item' from source: unknown 11579 1726882188.62068: variable 'item' from source: unknown 11579 1726882188.62355: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.62358: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.62361: variable 'omit' from source: magic vars 11579 1726882188.62400: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.62412: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.62442: variable 'item' from source: unknown 11579 1726882188.62508: variable 'item' from source: unknown 11579 1726882188.62800: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 
1726882188.62804: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.62806: variable 'omit' from source: magic vars 11579 1726882188.62808: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.62810: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.62840: variable 'item' from source: unknown 11579 1726882188.62910: variable 'item' from source: unknown 11579 1726882188.63182: dumping result to json 11579 1726882188.63185: done dumping result, returning 11579 1726882188.63188: done running TaskExecutor() for managed_node1/TASK: Include the task 'assert_profile_present.yml' [12673a56-9f93-f197-7423-000000000070] 11579 1726882188.63190: sending task result for task 12673a56-9f93-f197-7423-000000000070 11579 1726882188.63236: done sending task result for task 12673a56-9f93-f197-7423-000000000070 11579 1726882188.63239: WORKER PROCESS EXITING 11579 1726882188.63275: no more pending results, returning what we have 11579 1726882188.63281: in VariableManager get_vars() 11579 1726882188.63334: Calling all_inventory to load vars for managed_node1 11579 1726882188.63337: Calling groups_inventory to load vars for managed_node1 11579 1726882188.63340: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.63356: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.63360: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.63364: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.65562: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.67056: done with get_vars() 11579 1726882188.67077: variable 'ansible_search_path' from source: unknown 11579 1726882188.67095: variable 'ansible_search_path' from source: unknown 11579 1726882188.67105: variable 'ansible_search_path' from source: unknown 11579 
1726882188.67112: we have included files to process 11579 1726882188.67113: generating all_blocks data 11579 1726882188.67115: done generating all_blocks data 11579 1726882188.67120: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67121: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67123: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67320: in VariableManager get_vars() 11579 1726882188.67344: done with get_vars() 11579 1726882188.67590: done processing included file 11579 1726882188.67592: iterating over new_blocks loaded from include file 11579 1726882188.67596: in VariableManager get_vars() 11579 1726882188.67614: done with get_vars() 11579 1726882188.67616: filtering new block on tags 11579 1726882188.67637: done filtering new block on tags 11579 1726882188.67640: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0) 11579 1726882188.67645: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67646: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67650: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.67746: in VariableManager get_vars() 11579 1726882188.67768: done with get_vars() 11579 1726882188.67991: done 
processing included file 11579 1726882188.67995: iterating over new_blocks loaded from include file 11579 1726882188.67996: in VariableManager get_vars() 11579 1726882188.68013: done with get_vars() 11579 1726882188.68015: filtering new block on tags 11579 1726882188.68032: done filtering new block on tags 11579 1726882188.68034: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.0) 11579 1726882188.68038: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.68038: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.68041: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml 11579 1726882188.68133: in VariableManager get_vars() 11579 1726882188.68204: done with get_vars() 11579 1726882188.68418: done processing included file 11579 1726882188.68421: iterating over new_blocks loaded from include file 11579 1726882188.68422: in VariableManager get_vars() 11579 1726882188.68439: done with get_vars() 11579 1726882188.68441: filtering new block on tags 11579 1726882188.68459: done filtering new block on tags 11579 1726882188.68461: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml for managed_node1 => (item=bond0.1) 11579 1726882188.68465: extending task lists for all hosts with included blocks 11579 1726882188.71112: done extending task lists 11579 1726882188.71118: done processing included files 11579 1726882188.71119: results queue empty 11579 
1726882188.71120: checking for any_errors_fatal 11579 1726882188.71123: done checking for any_errors_fatal 11579 1726882188.71124: checking for max_fail_percentage 11579 1726882188.71125: done checking for max_fail_percentage 11579 1726882188.71125: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.71126: done checking to see if all hosts have failed 11579 1726882188.71127: getting the remaining hosts for this loop 11579 1726882188.71128: done getting the remaining hosts for this loop 11579 1726882188.71130: getting the next task for host managed_node1 11579 1726882188.71134: done getting next task for host managed_node1 11579 1726882188.71135: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11579 1726882188.71138: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882188.71140: getting variables 11579 1726882188.71141: in VariableManager get_vars() 11579 1726882188.71154: Calling all_inventory to load vars for managed_node1 11579 1726882188.71157: Calling groups_inventory to load vars for managed_node1 11579 1726882188.71159: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.71165: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.71168: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.71171: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.81259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.84148: done with get_vars() 11579 1726882188.84182: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:48 -0400 (0:00:00.242) 0:00:17.553 ****** 11579 1726882188.84461: entering _queue_task() for managed_node1/include_tasks 11579 1726882188.84825: worker is 1 (out of 1 available) 11579 1726882188.84837: exiting _queue_task() for managed_node1/include_tasks 11579 1726882188.84851: done queuing things up, now waiting for results queue to drain 11579 1726882188.84852: waiting for pending results... 
11579 1726882188.85224: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11579 1726882188.85265: in run() - task 12673a56-9f93-f197-7423-00000000025f 11579 1726882188.85288: variable 'ansible_search_path' from source: unknown 11579 1726882188.85298: variable 'ansible_search_path' from source: unknown 11579 1726882188.85347: calling self._execute() 11579 1726882188.85452: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.85464: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.85537: variable 'omit' from source: magic vars 11579 1726882188.85868: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.85887: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.85902: _execute() done 11579 1726882188.85910: dumping result to json 11579 1726882188.85918: done dumping result, returning 11579 1726882188.85927: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-f197-7423-00000000025f] 11579 1726882188.85938: sending task result for task 12673a56-9f93-f197-7423-00000000025f 11579 1726882188.86223: no more pending results, returning what we have 11579 1726882188.86229: in VariableManager get_vars() 11579 1726882188.86278: Calling all_inventory to load vars for managed_node1 11579 1726882188.86281: Calling groups_inventory to load vars for managed_node1 11579 1726882188.86283: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.86300: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.86303: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.86306: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.86908: done sending task result for task 12673a56-9f93-f197-7423-00000000025f 11579 1726882188.86911: WORKER PROCESS EXITING 11579 
1726882188.87721: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.89285: done with get_vars() 11579 1726882188.89305: variable 'ansible_search_path' from source: unknown 11579 1726882188.89307: variable 'ansible_search_path' from source: unknown 11579 1726882188.89344: we have included files to process 11579 1726882188.89345: generating all_blocks data 11579 1726882188.89347: done generating all_blocks data 11579 1726882188.89348: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882188.89349: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882188.89352: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882188.90344: done processing included file 11579 1726882188.90346: iterating over new_blocks loaded from include file 11579 1726882188.90348: in VariableManager get_vars() 11579 1726882188.90369: done with get_vars() 11579 1726882188.90371: filtering new block on tags 11579 1726882188.90397: done filtering new block on tags 11579 1726882188.90400: in VariableManager get_vars() 11579 1726882188.90420: done with get_vars() 11579 1726882188.90422: filtering new block on tags 11579 1726882188.90443: done filtering new block on tags 11579 1726882188.90445: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11579 1726882188.90451: extending task lists for all hosts with included blocks 11579 1726882188.90676: done extending task lists 11579 1726882188.90678: done processing included files 11579 1726882188.90679: results queue empty 11579 
1726882188.90679: checking for any_errors_fatal 11579 1726882188.90683: done checking for any_errors_fatal 11579 1726882188.90683: checking for max_fail_percentage 11579 1726882188.90684: done checking for max_fail_percentage 11579 1726882188.90685: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.90686: done checking to see if all hosts have failed 11579 1726882188.90687: getting the remaining hosts for this loop 11579 1726882188.90688: done getting the remaining hosts for this loop 11579 1726882188.90690: getting the next task for host managed_node1 11579 1726882188.90697: done getting next task for host managed_node1 11579 1726882188.90699: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882188.90702: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882188.90704: getting variables 11579 1726882188.90705: in VariableManager get_vars() 11579 1726882188.90717: Calling all_inventory to load vars for managed_node1 11579 1726882188.90719: Calling groups_inventory to load vars for managed_node1 11579 1726882188.90721: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.90727: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.90729: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.90732: Calling groups_plugins_play to load vars for managed_node1 11579 1726882188.91899: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882188.93458: done with get_vars() 11579 1726882188.93481: done getting variables 11579 1726882188.93528: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:48 -0400 (0:00:00.090) 0:00:17.644 ****** 11579 1726882188.93560: entering _queue_task() for managed_node1/set_fact 11579 1726882188.93921: worker is 1 (out of 1 available) 11579 1726882188.93933: exiting _queue_task() for managed_node1/set_fact 11579 1726882188.93944: done queuing things up, now waiting for results queue to drain 11579 1726882188.93945: waiting for pending results... 
11579 1726882188.94214: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882188.94600: in run() - task 12673a56-9f93-f197-7423-0000000003b0 11579 1726882188.94604: variable 'ansible_search_path' from source: unknown 11579 1726882188.94607: variable 'ansible_search_path' from source: unknown 11579 1726882188.94610: calling self._execute() 11579 1726882188.94678: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.94733: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.94749: variable 'omit' from source: magic vars 11579 1726882188.95461: variable 'ansible_distribution_major_version' from source: facts 11579 1726882188.95801: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882188.95804: variable 'omit' from source: magic vars 11579 1726882188.95806: variable 'omit' from source: magic vars 11579 1726882188.95810: variable 'omit' from source: magic vars 11579 1726882188.95851: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882188.95890: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882188.95921: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882188.96038: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.96058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882188.96098: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882188.96108: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.96116: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11579 1726882188.96344: Set connection var ansible_timeout to 10 11579 1726882188.96354: Set connection var ansible_shell_type to sh 11579 1726882188.96463: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882188.96473: Set connection var ansible_shell_executable to /bin/sh 11579 1726882188.96485: Set connection var ansible_pipelining to False 11579 1726882188.96491: Set connection var ansible_connection to ssh 11579 1726882188.96519: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.96528: variable 'ansible_connection' from source: unknown 11579 1726882188.96536: variable 'ansible_module_compression' from source: unknown 11579 1726882188.96542: variable 'ansible_shell_type' from source: unknown 11579 1726882188.96550: variable 'ansible_shell_executable' from source: unknown 11579 1726882188.96561: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882188.96570: variable 'ansible_pipelining' from source: unknown 11579 1726882188.96577: variable 'ansible_timeout' from source: unknown 11579 1726882188.96586: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882188.96929: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882188.96947: variable 'omit' from source: magic vars 11579 1726882188.96958: starting attempt loop 11579 1726882188.96965: running the handler 11579 1726882188.96983: handler run complete 11579 1726882188.97198: attempt loop complete, returning result 11579 1726882188.97203: _execute() done 11579 1726882188.97206: dumping result to json 11579 1726882188.97209: done dumping result, returning 11579 1726882188.97211: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-f197-7423-0000000003b0] 11579 1726882188.97213: sending task result for task 12673a56-9f93-f197-7423-0000000003b0 11579 1726882188.97282: done sending task result for task 12673a56-9f93-f197-7423-0000000003b0 11579 1726882188.97286: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11579 1726882188.97343: no more pending results, returning what we have 11579 1726882188.97347: results queue empty 11579 1726882188.97348: checking for any_errors_fatal 11579 1726882188.97350: done checking for any_errors_fatal 11579 1726882188.97351: checking for max_fail_percentage 11579 1726882188.97353: done checking for max_fail_percentage 11579 1726882188.97354: checking to see if all hosts have failed and the running result is not ok 11579 1726882188.97355: done checking to see if all hosts have failed 11579 1726882188.97356: getting the remaining hosts for this loop 11579 1726882188.97357: done getting the remaining hosts for this loop 11579 1726882188.97361: getting the next task for host managed_node1 11579 1726882188.97370: done getting next task for host managed_node1 11579 1726882188.97372: ^ task is: TASK: Stat profile file 11579 1726882188.97377: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882188.97381: getting variables 11579 1726882188.97384: in VariableManager get_vars() 11579 1726882188.97433: Calling all_inventory to load vars for managed_node1 11579 1726882188.97436: Calling groups_inventory to load vars for managed_node1 11579 1726882188.97439: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882188.97452: Calling all_plugins_play to load vars for managed_node1 11579 1726882188.97456: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882188.97459: Calling groups_plugins_play to load vars for managed_node1 11579 1726882189.00065: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882189.01636: done with get_vars() 11579 1726882189.01657: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:49 -0400 (0:00:00.081) 0:00:17.725 ****** 11579 1726882189.01749: entering _queue_task() for managed_node1/stat 11579 1726882189.02050: worker is 1 (out of 1 available) 11579 1726882189.02062: exiting _queue_task() for managed_node1/stat 11579 1726882189.02072: done queuing things up, now waiting for results queue to drain 11579 1726882189.02074: waiting for pending results... 
11579 1726882189.02514: running TaskExecutor() for managed_node1/TASK: Stat profile file 11579 1726882189.02519: in run() - task 12673a56-9f93-f197-7423-0000000003b1 11579 1726882189.02522: variable 'ansible_search_path' from source: unknown 11579 1726882189.02524: variable 'ansible_search_path' from source: unknown 11579 1726882189.02527: calling self._execute() 11579 1726882189.02600: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.02613: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.02629: variable 'omit' from source: magic vars 11579 1726882189.02988: variable 'ansible_distribution_major_version' from source: facts 11579 1726882189.03008: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882189.03018: variable 'omit' from source: magic vars 11579 1726882189.03065: variable 'omit' from source: magic vars 11579 1726882189.03185: variable 'profile' from source: include params 11579 1726882189.03188: variable 'item' from source: include params 11579 1726882189.03248: variable 'item' from source: include params 11579 1726882189.03294: variable 'omit' from source: magic vars 11579 1726882189.03316: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882189.03356: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882189.03379: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882189.03598: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.03601: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.03604: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 
1726882189.03606: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.03608: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.03610: Set connection var ansible_timeout to 10 11579 1726882189.03612: Set connection var ansible_shell_type to sh 11579 1726882189.03614: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882189.03616: Set connection var ansible_shell_executable to /bin/sh 11579 1726882189.03618: Set connection var ansible_pipelining to False 11579 1726882189.03621: Set connection var ansible_connection to ssh 11579 1726882189.03623: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.03625: variable 'ansible_connection' from source: unknown 11579 1726882189.03627: variable 'ansible_module_compression' from source: unknown 11579 1726882189.03633: variable 'ansible_shell_type' from source: unknown 11579 1726882189.03641: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.03647: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.03654: variable 'ansible_pipelining' from source: unknown 11579 1726882189.03661: variable 'ansible_timeout' from source: unknown 11579 1726882189.03667: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.03862: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882189.03879: variable 'omit' from source: magic vars 11579 1726882189.03889: starting attempt loop 11579 1726882189.03898: running the handler 11579 1726882189.03917: _low_level_execute_command(): starting 11579 1726882189.03931: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882189.05018: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882189.05054: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.05069: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882189.05386: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.05607: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.05720: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.07385: stdout chunk (state=3): >>>/root <<< 11579 1726882189.07519: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.07529: stdout chunk (state=3): >>><<< 11579 1726882189.07541: stderr chunk (state=3): >>><<< 11579 1726882189.07570: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.07590: _low_level_execute_command(): starting 11579 1726882189.07673: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928 `" && echo ansible-tmp-1726882189.0757818-12395-199685286969928="` echo /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928 `" ) && sleep 0' 11579 1726882189.08719: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.08732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.08756: stderr chunk (state=3): >>>debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.08865: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.08876: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.09010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.09084: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.10977: stdout chunk (state=3): >>>ansible-tmp-1726882189.0757818-12395-199685286969928=/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928 <<< 11579 1726882189.11172: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.11176: stdout chunk (state=3): >>><<< 11579 1726882189.11183: stderr chunk (state=3): >>><<< 11579 1726882189.11416: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882189.0757818-12395-199685286969928=/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.11464: variable 'ansible_module_compression' from source: unknown 11579 1726882189.11530: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882189.11569: variable 'ansible_facts' from source: unknown 11579 1726882189.11749: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py 11579 1726882189.12229: Sending initial data 11579 1726882189.12233: Sent initial data (153 bytes) 11579 1726882189.13256: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.13260: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882189.13312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882189.13316: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.13332: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration <<< 11579 1726882189.13338: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882189.13343: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882189.13349: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882189.13535: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.13539: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.13551: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.13567: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.13638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.15153: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882189.15192: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882189.15238: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpqx3gcsyx /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py <<< 11579 1726882189.15242: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py" <<< 11579 1726882189.15281: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpqx3gcsyx" to remote "/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py" <<< 11579 1726882189.16772: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.16783: stderr chunk (state=3): >>><<< 11579 1726882189.16799: stdout chunk (state=3): >>><<< 11579 1726882189.17001: done transferring module to remote 11579 1726882189.17005: _low_level_execute_command(): starting 11579 1726882189.17007: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/ /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py && sleep 0' 11579 1726882189.18298: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.18510: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.18581: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.20331: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.20341: stdout chunk (state=3): >>><<< 11579 1726882189.20561: stderr chunk (state=3): >>><<< 11579 1726882189.20577: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.20588: _low_level_execute_command(): starting 11579 1726882189.20591: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/AnsiballZ_stat.py && sleep 0' 11579 1726882189.21222: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882189.21309: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.21345: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.21348: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.21366: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.21446: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.36399: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882189.37565: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882189.37590: stderr chunk (state=3): >>><<< 11579 1726882189.37596: stdout chunk (state=3): >>><<< 11579 1726882189.37612: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
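The module result above shows that stat ran with get_attributes, get_checksum and get_mime all disabled, so it effectively reduces to an existence check on /etc/sysconfig/network-scripts/ifcfg-bond0. A rough sketch of that core check follows; it is an illustration only, not the actual ansible.builtin.stat source (the real module returns many more fields), and it assumes that with follow=false the link itself is stat'ed:

```python
import os

def stat_exists(path: str, follow: bool = False) -> dict:
    """Sketch of the existence check: with follow=False the path is checked
    with lexists(), so a dangling symlink still counts as existing; with
    follow=True the symlink target must exist."""
    exists = os.path.exists(path) if follow else os.path.lexists(path)
    return {"changed": False, "stat": {"exists": exists}}
```

On this test host the ifcfg file is absent, hence the {"exists": false} result and the conditional skip of the follow-up set_fact task below.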
11579 1726882189.37636: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882189.37643: _low_level_execute_command(): starting 11579 1726882189.37648: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882189.0757818-12395-199685286969928/ > /dev/null 2>&1 && sleep 0' 11579 1726882189.38059: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882189.38063: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.38065: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.38068: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration 
data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882189.38070: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.38123: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.38129: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.38170: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.40100: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.40103: stdout chunk (state=3): >>><<< 11579 1726882189.40106: stderr chunk (state=3): >>><<< 11579 1726882189.40108: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11579 1726882189.40111: handler run complete
11579 1726882189.40113: attempt loop complete, returning result
11579 1726882189.40115: _execute() done
11579 1726882189.40116: dumping result to json
11579 1726882189.40118: done dumping result, returning
11579 1726882189.40120: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-f197-7423-0000000003b1]
11579 1726882189.40121: sending task result for task 12673a56-9f93-f197-7423-0000000003b1
11579 1726882189.40180: done sending task result for task 12673a56-9f93-f197-7423-0000000003b1
11579 1726882189.40182: WORKER PROCESS EXITING
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
11579 1726882189.40254: no more pending results, returning what we have
11579 1726882189.40257: results queue empty
11579 1726882189.40258: checking for any_errors_fatal
11579 1726882189.40266: done checking for any_errors_fatal
11579 1726882189.40267: checking for max_fail_percentage
11579 1726882189.40269: done checking for max_fail_percentage
11579 1726882189.40270: checking to see if all hosts have failed and the running result is not ok
11579 1726882189.40271: done checking to see if all hosts have failed
11579 1726882189.40272: getting the remaining hosts for this loop
11579 1726882189.40273: done getting the remaining hosts for this loop
11579 1726882189.40277: getting the next task for host managed_node1
11579 1726882189.40285: done getting next task for host managed_node1
11579 1726882189.40287: ^ task is: TASK: Set NM profile exist flag based on the profile files
11579 1726882189.40292: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882189.40299: getting variables
11579 1726882189.40301: in VariableManager get_vars()
11579 1726882189.40346: Calling all_inventory to load vars for managed_node1
11579 1726882189.40349: Calling groups_inventory to load vars for managed_node1
11579 1726882189.40351: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882189.40363: Calling all_plugins_play to load vars for managed_node1
11579 1726882189.40366: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882189.40369: Calling groups_plugins_play to load vars for managed_node1
11579 1726882189.42029: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882189.43707: done with get_vars()
11579 1726882189.43735: done getting variables
11579 1726882189.43802: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Friday 20 September 2024 21:29:49 -0400 (0:00:00.420) 0:00:18.146 ******
11579 1726882189.43833: entering _queue_task() for managed_node1/set_fact
11579 1726882189.44298: worker is 1 (out of 1 available)
11579 1726882189.44309: exiting _queue_task() for managed_node1/set_fact
11579 1726882189.44319: done queuing things up, now waiting for results queue to drain
11579 1726882189.44321: waiting for pending results...
11579 1726882189.44604: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files
11579 1726882189.44614: in run() - task 12673a56-9f93-f197-7423-0000000003b2
11579 1726882189.44627: variable 'ansible_search_path' from source: unknown
11579 1726882189.44630: variable 'ansible_search_path' from source: unknown
11579 1726882189.44668: calling self._execute()
11579 1726882189.44772: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882189.44778: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882189.44788: variable 'omit' from source: magic vars
11579 1726882189.45186: variable 'ansible_distribution_major_version' from source: facts
11579 1726882189.45204: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882189.45335: variable 'profile_stat' from source: set_fact
11579 1726882189.45348: Evaluated conditional (profile_stat.stat.exists): False
11579 1726882189.45352: when evaluation is False, skipping this task
11579 1726882189.45356: _execute() done
11579 1726882189.45358: dumping result to json
11579 1726882189.45361: done dumping result, returning
11579 1726882189.45367: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-f197-7423-0000000003b2]
11579 1726882189.45499: sending task result for task 12673a56-9f93-f197-7423-0000000003b2
11579 1726882189.45558: done sending task result for task 12673a56-9f93-f197-7423-0000000003b2
11579 1726882189.45561: WORKER PROCESS EXITING
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11579 1726882189.45633: no more pending results, returning what we have
11579 1726882189.45637: results queue empty
11579 1726882189.45638: checking for any_errors_fatal
11579 1726882189.45643: done checking for any_errors_fatal
11579 1726882189.45643: checking for max_fail_percentage
11579 1726882189.45645: done checking for max_fail_percentage
11579 1726882189.45646: checking to see if all hosts have failed and the running result is not ok
11579 1726882189.45646: done checking to see if all hosts have failed
11579 1726882189.45647: getting the remaining hosts for this loop
11579 1726882189.45649: done getting the remaining hosts for this loop
11579 1726882189.45652: getting the next task for host managed_node1
11579 1726882189.45658: done getting next task for host managed_node1
11579 1726882189.45660: ^ task is: TASK: Get NM profile info
11579 1726882189.45663: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882189.45667: getting variables
11579 1726882189.45668: in VariableManager get_vars()
11579 1726882189.45821: Calling all_inventory to load vars for managed_node1
11579 1726882189.45824: Calling groups_inventory to load vars for managed_node1
11579 1726882189.45826: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882189.45834: Calling all_plugins_play to load vars for managed_node1
11579 1726882189.45837: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882189.45840: Calling groups_plugins_play to load vars for managed_node1
11579 1726882189.47291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882189.48954: done with get_vars()
11579 1726882189.48974: done getting variables
11579 1726882189.49033: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 21:29:49 -0400 (0:00:00.052) 0:00:18.199 ******
11579 1726882189.49070: entering _queue_task() for managed_node1/shell
11579 1726882189.49483: worker is 1 (out of 1 available)
11579 1726882189.49495: exiting _queue_task() for managed_node1/shell
11579 1726882189.49505: done queuing things up, now waiting for results queue to drain
11579 1726882189.49507: waiting for pending results...
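The skip of the set_fact task above follows how Ansible handles 'when': each conditional is evaluated in order (ansible_distribution_major_version != '6' came out True, profile_stat.stat.exists came out False) and the first False short-circuits the task into a skip, recording it as false_condition. A toy model of that control flow, purely illustrative: real Ansible templates the expressions through Jinja2 rather than Python eval, and the helper name run_when is mine:

```python
from types import SimpleNamespace

def run_when(conditions, variables):
    """Evaluate 'when' clauses in order; the first False one skips the task.

    Python eval stands in here for Ansible's Jinja2 conditional templating.
    """
    for cond in conditions:
        if not eval(cond, {}, variables):
            return {"skipped": True, "false_condition": cond,
                    "skip_reason": "Conditional result was False"}
    return {"skipped": False}

# Mirror the trace: the version check passes, the existence flag is False.
# The major-version value "9" is assumed; only the inequality with '6' matters.
variables = {
    "ansible_distribution_major_version": "9",
    "profile_stat": SimpleNamespace(stat=SimpleNamespace(exists=False)),
}
result = run_when(
    ["ansible_distribution_major_version != '6'", "profile_stat.stat.exists"],
    variables,
)
```

Ordering the cheap distribution check before the stat lookup means the later conditions are never evaluated on hosts where an early clause already fails.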
11579 1726882189.49675: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11579 1726882189.49902: in run() - task 12673a56-9f93-f197-7423-0000000003b3 11579 1726882189.49906: variable 'ansible_search_path' from source: unknown 11579 1726882189.49908: variable 'ansible_search_path' from source: unknown 11579 1726882189.49912: calling self._execute() 11579 1726882189.49968: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.49972: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.50180: variable 'omit' from source: magic vars 11579 1726882189.50626: variable 'ansible_distribution_major_version' from source: facts 11579 1726882189.50639: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882189.50645: variable 'omit' from source: magic vars 11579 1726882189.50696: variable 'omit' from source: magic vars 11579 1726882189.51002: variable 'profile' from source: include params 11579 1726882189.51005: variable 'item' from source: include params 11579 1726882189.51081: variable 'item' from source: include params 11579 1726882189.51099: variable 'omit' from source: magic vars 11579 1726882189.51137: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882189.51297: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882189.51385: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882189.51412: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.51415: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.51447: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 
1726882189.51451: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.51453: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.51637: Set connection var ansible_timeout to 10 11579 1726882189.51643: Set connection var ansible_shell_type to sh 11579 1726882189.51653: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882189.51691: Set connection var ansible_shell_executable to /bin/sh 11579 1726882189.51699: Set connection var ansible_pipelining to False 11579 1726882189.51702: Set connection var ansible_connection to ssh 11579 1726882189.51722: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.51725: variable 'ansible_connection' from source: unknown 11579 1726882189.51728: variable 'ansible_module_compression' from source: unknown 11579 1726882189.51731: variable 'ansible_shell_type' from source: unknown 11579 1726882189.51733: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.51735: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.51737: variable 'ansible_pipelining' from source: unknown 11579 1726882189.51741: variable 'ansible_timeout' from source: unknown 11579 1726882189.51743: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.52101: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882189.52106: variable 'omit' from source: magic vars 11579 1726882189.52108: starting attempt loop 11579 1726882189.52110: running the handler 11579 1726882189.52112: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882189.52115: _low_level_execute_command(): starting 11579 1726882189.52117: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882189.52690: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882189.52792: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.52812: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.52879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.54501: stdout chunk (state=3): >>>/root <<< 11579 1726882189.54575: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.54581: stdout chunk (state=3): >>><<< 11579 1726882189.54589: stderr chunk (state=3): >>><<< 11579 1726882189.54664: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.54677: _low_level_execute_command(): starting 11579 1726882189.54681: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172 `" && echo ansible-tmp-1726882189.5466347-12417-95969461824172="` echo /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172 `" ) && sleep 0' 11579 1726882189.55901: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882189.55905: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.55928: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.56113: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.56203: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.58044: stdout chunk (state=3): >>>ansible-tmp-1726882189.5466347-12417-95969461824172=/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172 <<< 11579 1726882189.58173: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.58183: stdout chunk (state=3): >>><<< 11579 1726882189.58200: stderr chunk (state=3): >>><<< 11579 1726882189.58222: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882189.5466347-12417-95969461824172=/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: 
Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.58258: variable 'ansible_module_compression' from source: unknown 11579 1726882189.58500: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882189.58503: variable 'ansible_facts' from source: unknown 11579 1726882189.58576: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py 11579 1726882189.59121: Sending initial data 11579 1726882189.59124: Sent initial data (155 bytes) 11579 1726882189.60139: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.60152: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.60162: stderr chunk (state=3): >>>debug1: configuration requests final Match 
pass <<< 11579 1726882189.60273: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.60285: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.60416: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.60429: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.60502: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.60609: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.62332: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882189.62463: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882189.62516: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpgptfk__1 /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py <<< 11579 1726882189.62520: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py" <<< 11579 1726882189.62564: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpgptfk__1" to remote "/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py" <<< 11579 1726882189.63845: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.63849: stderr chunk (state=3): >>><<< 11579 1726882189.63852: stdout chunk (state=3): >>><<< 11579 1726882189.64006: done transferring module to remote 11579 1726882189.64016: _low_level_execute_command(): starting 11579 1726882189.64030: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/ /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py && sleep 0' 11579 1726882189.65091: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.65120: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.65176: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.66920: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.66959: stderr chunk (state=3): >>><<< 11579 1726882189.66962: stdout chunk (state=3): >>><<< 11579 1726882189.66981: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.66984: _low_level_execute_command(): starting 11579 1726882189.66987: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/AnsiballZ_command.py && sleep 0' 11579 1726882189.67547: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882189.67550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.67553: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882189.67565: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882189.67577: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882189.67584: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882189.67598: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.67758: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882189.67761: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882189.67763: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882189.67764: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882189.67766: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882189.67768: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882189.67770: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 <<< 11579 1726882189.67772: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882189.67774: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.67777: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.67909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.67940: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.68077: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.85265: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:29:49.827145", "end": "2024-09-20 21:29:49.851102", "delta": "0:00:00.023957", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882189.86707: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882189.86711: stdout chunk (state=3): >>><<< 11579 1726882189.86714: stderr chunk (state=3): >>><<< 11579 1726882189.86740: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection \nbond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection \nbond0 /etc/NetworkManager/system-connections/bond0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "start": "2024-09-20 21:29:49.827145", "end": "2024-09-20 21:29:49.851102", "delta": "0:00:00.023957", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882189.86779: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882189.86787: _low_level_execute_command(): starting 11579 1726882189.86794: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882189.5466347-12417-95969461824172/ > /dev/null 2>&1 && sleep 0' 11579 1726882189.87441: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882189.87556: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.87559: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882189.87561: stderr chunk (state=3): >>>debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882189.87583: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882189.87590: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882189.87613: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882189.87627: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882189.87699: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882189.89456: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882189.89475: stderr chunk (state=3): >>><<< 11579 1726882189.89478: stdout chunk (state=3): >>><<< 11579 1726882189.89490: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882189.89498: handler run complete 11579 1726882189.89521: Evaluated conditional (False): False 11579 1726882189.89528: attempt loop complete, returning result 11579 1726882189.89531: _execute() done 11579 1726882189.89534: dumping result to json 11579 1726882189.89539: done dumping result, returning 11579 1726882189.89546: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-f197-7423-0000000003b3] 11579 1726882189.89550: sending task result for task 12673a56-9f93-f197-7423-0000000003b3 11579 1726882189.89643: done sending task result for task 12673a56-9f93-f197-7423-0000000003b3 11579 1726882189.89646: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0 | grep /etc", "delta": "0:00:00.023957", "end": "2024-09-20 21:29:49.851102", "rc": 0, "start": "2024-09-20 21:29:49.827145" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection bond0 /etc/NetworkManager/system-connections/bond0.nmconnection 11579 1726882189.89712: no more pending results, returning what we have 11579 1726882189.89715: results queue empty 11579 1726882189.89716: checking for any_errors_fatal 11579 1726882189.89720: done checking for any_errors_fatal 11579 1726882189.89721: checking for max_fail_percentage 11579 1726882189.89722: done checking for max_fail_percentage 11579 1726882189.89723: checking to see if all hosts have failed and the running result is not ok 11579 1726882189.89724: done checking to see if all 
hosts have failed 11579 1726882189.89725: getting the remaining hosts for this loop 11579 1726882189.89727: done getting the remaining hosts for this loop 11579 1726882189.89730: getting the next task for host managed_node1 11579 1726882189.89737: done getting next task for host managed_node1 11579 1726882189.89738: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882189.89742: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882189.89745: getting variables 11579 1726882189.89747: in VariableManager get_vars() 11579 1726882189.89786: Calling all_inventory to load vars for managed_node1 11579 1726882189.89788: Calling groups_inventory to load vars for managed_node1 11579 1726882189.89790: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882189.89809: Calling all_plugins_play to load vars for managed_node1 11579 1726882189.89812: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882189.89815: Calling groups_plugins_play to load vars for managed_node1 11579 1726882189.90592: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882189.91559: done with get_vars() 11579 1726882189.91574: done getting variables 11579 1726882189.91619: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:49 -0400 (0:00:00.425) 0:00:18.624 ****** 11579 1726882189.91642: entering _queue_task() for managed_node1/set_fact 11579 1726882189.91856: worker is 1 (out of 1 available) 11579 1726882189.91868: exiting _queue_task() for managed_node1/set_fact 11579 1726882189.91879: done queuing things up, now waiting for results queue to drain 11579 1726882189.91880: waiting for pending results... 
11579 1726882189.92043: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882189.92117: in run() - task 12673a56-9f93-f197-7423-0000000003b4 11579 1726882189.92129: variable 'ansible_search_path' from source: unknown 11579 1726882189.92132: variable 'ansible_search_path' from source: unknown 11579 1726882189.92159: calling self._execute() 11579 1726882189.92231: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.92234: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.92244: variable 'omit' from source: magic vars 11579 1726882189.92537: variable 'ansible_distribution_major_version' from source: facts 11579 1726882189.92551: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882189.92637: variable 'nm_profile_exists' from source: set_fact 11579 1726882189.92651: Evaluated conditional (nm_profile_exists.rc == 0): True 11579 1726882189.92655: variable 'omit' from source: magic vars 11579 1726882189.92690: variable 'omit' from source: magic vars 11579 1726882189.92714: variable 'omit' from source: magic vars 11579 1726882189.92746: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882189.92775: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882189.92789: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882189.92804: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.92813: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882189.92837: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11579 1726882189.92840: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.92842: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.92914: Set connection var ansible_timeout to 10 11579 1726882189.92917: Set connection var ansible_shell_type to sh 11579 1726882189.92924: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882189.92929: Set connection var ansible_shell_executable to /bin/sh 11579 1726882189.92935: Set connection var ansible_pipelining to False 11579 1726882189.92937: Set connection var ansible_connection to ssh 11579 1726882189.92953: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.92955: variable 'ansible_connection' from source: unknown 11579 1726882189.92958: variable 'ansible_module_compression' from source: unknown 11579 1726882189.92960: variable 'ansible_shell_type' from source: unknown 11579 1726882189.92962: variable 'ansible_shell_executable' from source: unknown 11579 1726882189.92964: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.92969: variable 'ansible_pipelining' from source: unknown 11579 1726882189.92971: variable 'ansible_timeout' from source: unknown 11579 1726882189.92974: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.93072: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882189.93080: variable 'omit' from source: magic vars 11579 1726882189.93085: starting attempt loop 11579 1726882189.93087: running the handler 11579 1726882189.93102: handler run complete 11579 1726882189.93111: attempt loop complete, returning result 11579 1726882189.93113: _execute() done 
11579 1726882189.93116: dumping result to json 11579 1726882189.93118: done dumping result, returning 11579 1726882189.93127: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-f197-7423-0000000003b4] 11579 1726882189.93131: sending task result for task 12673a56-9f93-f197-7423-0000000003b4 11579 1726882189.93210: done sending task result for task 12673a56-9f93-f197-7423-0000000003b4 11579 1726882189.93214: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11579 1726882189.93266: no more pending results, returning what we have 11579 1726882189.93269: results queue empty 11579 1726882189.93270: checking for any_errors_fatal 11579 1726882189.93281: done checking for any_errors_fatal 11579 1726882189.93282: checking for max_fail_percentage 11579 1726882189.93283: done checking for max_fail_percentage 11579 1726882189.93284: checking to see if all hosts have failed and the running result is not ok 11579 1726882189.93285: done checking to see if all hosts have failed 11579 1726882189.93286: getting the remaining hosts for this loop 11579 1726882189.93287: done getting the remaining hosts for this loop 11579 1726882189.93290: getting the next task for host managed_node1 11579 1726882189.93303: done getting next task for host managed_node1 11579 1726882189.93305: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882189.93308: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882189.93311: getting variables 11579 1726882189.93312: in VariableManager get_vars() 11579 1726882189.93346: Calling all_inventory to load vars for managed_node1 11579 1726882189.93348: Calling groups_inventory to load vars for managed_node1 11579 1726882189.93350: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882189.93358: Calling all_plugins_play to load vars for managed_node1 11579 1726882189.93361: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882189.93363: Calling groups_plugins_play to load vars for managed_node1 11579 1726882189.94471: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882189.95448: done with get_vars() 11579 1726882189.95463: done getting variables 11579 1726882189.95510: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882189.95597: variable 'profile' from source: include params 11579 1726882189.95600: variable 'item' from source: include params 11579 1726882189.95642: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0] ************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:49 -0400 (0:00:00.040) 0:00:18.665 ****** 11579 1726882189.95667: entering _queue_task() for managed_node1/command 11579 1726882189.95904: worker is 1 (out of 1 available) 11579 1726882189.95919: exiting _queue_task() for managed_node1/command 11579 1726882189.95929: done queuing things up, now waiting for results queue to drain 11579 1726882189.95931: waiting for pending results... 11579 1726882189.96091: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 11579 1726882189.96174: in run() - task 12673a56-9f93-f197-7423-0000000003b6 11579 1726882189.96186: variable 'ansible_search_path' from source: unknown 11579 1726882189.96189: variable 'ansible_search_path' from source: unknown 11579 1726882189.96223: calling self._execute() 11579 1726882189.96296: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882189.96303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882189.96313: variable 'omit' from source: magic vars 11579 1726882189.97002: variable 'ansible_distribution_major_version' from source: facts 11579 1726882189.97005: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882189.97007: variable 'profile_stat' from source: set_fact 11579 1726882189.97009: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882189.97011: when evaluation is False, skipping this task 11579 1726882189.97013: _execute() done 11579 1726882189.97015: dumping result to json 11579 1726882189.97017: done dumping result, returning 11579 1726882189.97019: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-f197-7423-0000000003b6] 11579 1726882189.97020: sending task result for task 12673a56-9f93-f197-7423-0000000003b6 11579 
1726882189.97076: done sending task result for task 12673a56-9f93-f197-7423-0000000003b6 11579 1726882189.97079: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882189.97137: no more pending results, returning what we have 11579 1726882189.97139: results queue empty 11579 1726882189.97140: checking for any_errors_fatal 11579 1726882189.97144: done checking for any_errors_fatal 11579 1726882189.97145: checking for max_fail_percentage 11579 1726882189.97146: done checking for max_fail_percentage 11579 1726882189.97147: checking to see if all hosts have failed and the running result is not ok 11579 1726882189.97148: done checking to see if all hosts have failed 11579 1726882189.97149: getting the remaining hosts for this loop 11579 1726882189.97150: done getting the remaining hosts for this loop 11579 1726882189.97153: getting the next task for host managed_node1 11579 1726882189.97158: done getting next task for host managed_node1 11579 1726882189.97159: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882189.97163: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11579 1726882189.97166: getting variables 11579 1726882189.97167: in VariableManager get_vars() 11579 1726882189.97199: Calling all_inventory to load vars for managed_node1 11579 1726882189.97201: Calling groups_inventory to load vars for managed_node1 11579 1726882189.97203: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882189.97212: Calling all_plugins_play to load vars for managed_node1 11579 1726882189.97214: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882189.97216: Calling groups_plugins_play to load vars for managed_node1 11579 1726882189.98787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.00375: done with get_vars() 11579 1726882190.00402: done getting variables 11579 1726882190.00464: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.00580: variable 'profile' from source: include params 11579 1726882190.00584: variable 'item' from source: include params 11579 1726882190.00645: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0] *********************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:50 -0400 (0:00:00.050) 0:00:18.715 ****** 11579 1726882190.00674: entering _queue_task() for managed_node1/set_fact 11579 1726882190.01020: worker is 1 (out of 1 available) 11579 1726882190.01032: exiting _queue_task() for managed_node1/set_fact 11579 1726882190.01044: done queuing things up, now waiting for results queue 
to drain 11579 1726882190.01046: waiting for pending results... 11579 1726882190.01347: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 11579 1726882190.01486: in run() - task 12673a56-9f93-f197-7423-0000000003b7 11579 1726882190.01491: variable 'ansible_search_path' from source: unknown 11579 1726882190.01498: variable 'ansible_search_path' from source: unknown 11579 1726882190.01514: calling self._execute() 11579 1726882190.01624: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.01628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.01651: variable 'omit' from source: magic vars 11579 1726882190.02037: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.02050: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.02207: variable 'profile_stat' from source: set_fact 11579 1726882190.02211: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882190.02213: when evaluation is False, skipping this task 11579 1726882190.02216: _execute() done 11579 1726882190.02218: dumping result to json 11579 1726882190.02228: done dumping result, returning 11579 1726882190.02231: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0 [12673a56-9f93-f197-7423-0000000003b7] 11579 1726882190.02233: sending task result for task 12673a56-9f93-f197-7423-0000000003b7 11579 1726882190.02447: done sending task result for task 12673a56-9f93-f197-7423-0000000003b7 11579 1726882190.02451: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882190.02502: no more pending results, returning what we have 11579 1726882190.02506: results queue empty 11579 1726882190.02507: checking for any_errors_fatal 11579 1726882190.02512: 
done checking for any_errors_fatal 11579 1726882190.02513: checking for max_fail_percentage 11579 1726882190.02515: done checking for max_fail_percentage 11579 1726882190.02516: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.02517: done checking to see if all hosts have failed 11579 1726882190.02518: getting the remaining hosts for this loop 11579 1726882190.02519: done getting the remaining hosts for this loop 11579 1726882190.02523: getting the next task for host managed_node1 11579 1726882190.02530: done getting next task for host managed_node1 11579 1726882190.02533: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11579 1726882190.02537: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882190.02541: getting variables 11579 1726882190.02543: in VariableManager get_vars() 11579 1726882190.02587: Calling all_inventory to load vars for managed_node1 11579 1726882190.02590: Calling groups_inventory to load vars for managed_node1 11579 1726882190.02597: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.02610: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.02614: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.02617: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.04084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.05020: done with get_vars() 11579 1726882190.05036: done getting variables 11579 1726882190.05081: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.05165: variable 'profile' from source: include params 11579 1726882190.05168: variable 'item' from source: include params 11579 1726882190.05210: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0] ****************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:50 -0400 (0:00:00.045) 0:00:18.760 ****** 11579 1726882190.05234: entering _queue_task() for managed_node1/command 11579 1726882190.05476: worker is 1 (out of 1 available) 11579 1726882190.05487: exiting _queue_task() for managed_node1/command 11579 1726882190.05503: done queuing things up, now waiting for results queue to drain 11579 1726882190.05504: waiting for pending results... 
11579 1726882190.05710: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 11579 1726882190.05841: in run() - task 12673a56-9f93-f197-7423-0000000003b8 11579 1726882190.05864: variable 'ansible_search_path' from source: unknown 11579 1726882190.05871: variable 'ansible_search_path' from source: unknown 11579 1726882190.05911: calling self._execute() 11579 1726882190.06065: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.06069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.06118: variable 'omit' from source: magic vars 11579 1726882190.06526: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.06529: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.06901: variable 'profile_stat' from source: set_fact 11579 1726882190.06905: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882190.06907: when evaluation is False, skipping this task 11579 1726882190.06909: _execute() done 11579 1726882190.06911: dumping result to json 11579 1726882190.06912: done dumping result, returning 11579 1726882190.06914: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0 [12673a56-9f93-f197-7423-0000000003b8] 11579 1726882190.06916: sending task result for task 12673a56-9f93-f197-7423-0000000003b8 11579 1726882190.06972: done sending task result for task 12673a56-9f93-f197-7423-0000000003b8 11579 1726882190.06976: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882190.07018: no more pending results, returning what we have 11579 1726882190.07021: results queue empty 11579 1726882190.07022: checking for any_errors_fatal 11579 1726882190.07027: done checking for any_errors_fatal 11579 1726882190.07027: checking for 
max_fail_percentage 11579 1726882190.07029: done checking for max_fail_percentage 11579 1726882190.07030: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.07031: done checking to see if all hosts have failed 11579 1726882190.07031: getting the remaining hosts for this loop 11579 1726882190.07033: done getting the remaining hosts for this loop 11579 1726882190.07036: getting the next task for host managed_node1 11579 1726882190.07042: done getting next task for host managed_node1 11579 1726882190.07045: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11579 1726882190.07049: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882190.07052: getting variables 11579 1726882190.07054: in VariableManager get_vars() 11579 1726882190.07099: Calling all_inventory to load vars for managed_node1 11579 1726882190.07102: Calling groups_inventory to load vars for managed_node1 11579 1726882190.07104: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.07114: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.07117: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.07120: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.08357: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.09206: done with get_vars() 11579 1726882190.09221: done getting variables 11579 1726882190.09261: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.09338: variable 'profile' from source: include params 11579 1726882190.09341: variable 'item' from source: include params 11579 1726882190.09376: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0] *************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:50 -0400 (0:00:00.041) 0:00:18.802 ****** 11579 1726882190.09403: entering _queue_task() for managed_node1/set_fact 11579 1726882190.09655: worker is 1 (out of 1 available) 11579 1726882190.09668: exiting _queue_task() for managed_node1/set_fact 11579 1726882190.09680: done queuing things up, now waiting for results queue to drain 11579 1726882190.09681: waiting for pending results... 
11579 1726882190.10232: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 11579 1726882190.10738: in run() - task 12673a56-9f93-f197-7423-0000000003b9 11579 1726882190.10742: variable 'ansible_search_path' from source: unknown 11579 1726882190.10745: variable 'ansible_search_path' from source: unknown 11579 1726882190.11102: calling self._execute() 11579 1726882190.11108: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.11127: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.11146: variable 'omit' from source: magic vars 11579 1726882190.11570: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.11582: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.11710: variable 'profile_stat' from source: set_fact 11579 1726882190.11721: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882190.11724: when evaluation is False, skipping this task 11579 1726882190.11727: _execute() done 11579 1726882190.11730: dumping result to json 11579 1726882190.11732: done dumping result, returning 11579 1726882190.11741: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0 [12673a56-9f93-f197-7423-0000000003b9] 11579 1726882190.11747: sending task result for task 12673a56-9f93-f197-7423-0000000003b9 11579 1726882190.11834: done sending task result for task 12673a56-9f93-f197-7423-0000000003b9 11579 1726882190.11836: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882190.11906: no more pending results, returning what we have 11579 1726882190.11910: results queue empty 11579 1726882190.11911: checking for any_errors_fatal 11579 1726882190.11918: done checking for any_errors_fatal 11579 1726882190.11918: checking for 
max_fail_percentage 11579 1726882190.11920: done checking for max_fail_percentage 11579 1726882190.11921: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.11922: done checking to see if all hosts have failed 11579 1726882190.11923: getting the remaining hosts for this loop 11579 1726882190.11924: done getting the remaining hosts for this loop 11579 1726882190.11928: getting the next task for host managed_node1 11579 1726882190.11937: done getting next task for host managed_node1 11579 1726882190.11940: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11579 1726882190.11943: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882190.11947: getting variables 11579 1726882190.11949: in VariableManager get_vars() 11579 1726882190.11989: Calling all_inventory to load vars for managed_node1 11579 1726882190.11992: Calling groups_inventory to load vars for managed_node1 11579 1726882190.11998: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.12012: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.12015: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.12018: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.14381: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.16251: done with get_vars() 11579 1726882190.16281: done getting variables 11579 1726882190.16348: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.16466: variable 'profile' from source: include params 11579 1726882190.16470: variable 'item' from source: include params 11579 1726882190.16530: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0'] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:50 -0400 (0:00:00.071) 0:00:18.874 ****** 11579 1726882190.16561: entering _queue_task() for managed_node1/assert 11579 1726882190.16909: worker is 1 (out of 1 available) 11579 1726882190.16923: exiting _queue_task() for managed_node1/assert 11579 1726882190.16933: done queuing things up, now waiting for results queue to drain 11579 1726882190.16934: waiting for pending results... 
11579 1726882190.17222: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' 11579 1726882190.17366: in run() - task 12673a56-9f93-f197-7423-000000000260 11579 1726882190.17380: variable 'ansible_search_path' from source: unknown 11579 1726882190.17384: variable 'ansible_search_path' from source: unknown 11579 1726882190.17448: calling self._execute() 11579 1726882190.17653: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.17656: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.17713: variable 'omit' from source: magic vars 11579 1726882190.18618: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.18621: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.18624: variable 'omit' from source: magic vars 11579 1726882190.18632: variable 'omit' from source: magic vars 11579 1726882190.18933: variable 'profile' from source: include params 11579 1726882190.18942: variable 'item' from source: include params 11579 1726882190.18998: variable 'item' from source: include params 11579 1726882190.19030: variable 'omit' from source: magic vars 11579 1726882190.19068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882190.19165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882190.19207: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882190.19224: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.19270: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.19274: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882190.19276: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.19278: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.19377: Set connection var ansible_timeout to 10 11579 1726882190.19381: Set connection var ansible_shell_type to sh 11579 1726882190.19387: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882190.19403: Set connection var ansible_shell_executable to /bin/sh 11579 1726882190.19488: Set connection var ansible_pipelining to False 11579 1726882190.19492: Set connection var ansible_connection to ssh 11579 1726882190.19496: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.19498: variable 'ansible_connection' from source: unknown 11579 1726882190.19500: variable 'ansible_module_compression' from source: unknown 11579 1726882190.19504: variable 'ansible_shell_type' from source: unknown 11579 1726882190.19506: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.19508: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.19510: variable 'ansible_pipelining' from source: unknown 11579 1726882190.19512: variable 'ansible_timeout' from source: unknown 11579 1726882190.19514: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.19605: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882190.19615: variable 'omit' from source: magic vars 11579 1726882190.19622: starting attempt loop 11579 1726882190.19706: running the handler 11579 1726882190.19738: variable 'lsr_net_profile_exists' from source: set_fact 11579 1726882190.19744: Evaluated conditional 
(lsr_net_profile_exists): True 11579 1726882190.19750: handler run complete 11579 1726882190.19765: attempt loop complete, returning result 11579 1726882190.19768: _execute() done 11579 1726882190.19771: dumping result to json 11579 1726882190.19773: done dumping result, returning 11579 1726882190.19781: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0' [12673a56-9f93-f197-7423-000000000260] 11579 1726882190.19786: sending task result for task 12673a56-9f93-f197-7423-000000000260 11579 1726882190.19882: done sending task result for task 12673a56-9f93-f197-7423-000000000260 11579 1726882190.19884: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882190.19966: no more pending results, returning what we have 11579 1726882190.19969: results queue empty 11579 1726882190.19970: checking for any_errors_fatal 11579 1726882190.19977: done checking for any_errors_fatal 11579 1726882190.19979: checking for max_fail_percentage 11579 1726882190.19980: done checking for max_fail_percentage 11579 1726882190.19981: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.19983: done checking to see if all hosts have failed 11579 1726882190.19983: getting the remaining hosts for this loop 11579 1726882190.19985: done getting the remaining hosts for this loop 11579 1726882190.19988: getting the next task for host managed_node1 11579 1726882190.19999: done getting next task for host managed_node1 11579 1726882190.20002: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11579 1726882190.20005: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882190.20010: getting variables 11579 1726882190.20011: in VariableManager get_vars() 11579 1726882190.20058: Calling all_inventory to load vars for managed_node1 11579 1726882190.20061: Calling groups_inventory to load vars for managed_node1 11579 1726882190.20064: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.20075: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.20079: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.20082: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.21685: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.23184: done with get_vars() 11579 1726882190.23209: done getting variables 11579 1726882190.23270: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.23383: variable 'profile' from source: include params 11579 1726882190.23387: variable 'item' from source: include params 11579 1726882190.23443: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0'] *********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:50 -0400 
(0:00:00.069) 0:00:18.943 ****** 11579 1726882190.23474: entering _queue_task() for managed_node1/assert 11579 1726882190.23776: worker is 1 (out of 1 available) 11579 1726882190.23788: exiting _queue_task() for managed_node1/assert 11579 1726882190.23803: done queuing things up, now waiting for results queue to drain 11579 1726882190.23804: waiting for pending results... 11579 1726882190.24199: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' 11579 1726882190.24205: in run() - task 12673a56-9f93-f197-7423-000000000261 11579 1726882190.24208: variable 'ansible_search_path' from source: unknown 11579 1726882190.24211: variable 'ansible_search_path' from source: unknown 11579 1726882190.24213: calling self._execute() 11579 1726882190.24308: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.24312: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.24322: variable 'omit' from source: magic vars 11579 1726882190.24687: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.24703: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.24710: variable 'omit' from source: magic vars 11579 1726882190.24752: variable 'omit' from source: magic vars 11579 1726882190.24854: variable 'profile' from source: include params 11579 1726882190.24859: variable 'item' from source: include params 11579 1726882190.24944: variable 'item' from source: include params 11579 1726882190.24947: variable 'omit' from source: magic vars 11579 1726882190.24980: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882190.25020: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882190.25052: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 
1726882190.25058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.25098: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.25102: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882190.25105: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.25110: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.25212: Set connection var ansible_timeout to 10 11579 1726882190.25217: Set connection var ansible_shell_type to sh 11579 1726882190.25269: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882190.25273: Set connection var ansible_shell_executable to /bin/sh 11579 1726882190.25275: Set connection var ansible_pipelining to False 11579 1726882190.25279: Set connection var ansible_connection to ssh 11579 1726882190.25281: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.25283: variable 'ansible_connection' from source: unknown 11579 1726882190.25287: variable 'ansible_module_compression' from source: unknown 11579 1726882190.25290: variable 'ansible_shell_type' from source: unknown 11579 1726882190.25292: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.25296: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.25298: variable 'ansible_pipelining' from source: unknown 11579 1726882190.25301: variable 'ansible_timeout' from source: unknown 11579 1726882190.25303: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.25423: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882190.25487: variable 'omit' from source: magic vars 11579 1726882190.25490: starting attempt loop 11579 1726882190.25494: running the handler 11579 1726882190.25551: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11579 1726882190.25554: Evaluated conditional (lsr_net_profile_ansible_managed): True 11579 1726882190.25561: handler run complete 11579 1726882190.25575: attempt loop complete, returning result 11579 1726882190.25578: _execute() done 11579 1726882190.25581: dumping result to json 11579 1726882190.25583: done dumping result, returning 11579 1726882190.25596: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0' [12673a56-9f93-f197-7423-000000000261] 11579 1726882190.25604: sending task result for task 12673a56-9f93-f197-7423-000000000261 11579 1726882190.25753: done sending task result for task 12673a56-9f93-f197-7423-000000000261 11579 1726882190.25756: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882190.25845: no more pending results, returning what we have 11579 1726882190.25848: results queue empty 11579 1726882190.25849: checking for any_errors_fatal 11579 1726882190.25852: done checking for any_errors_fatal 11579 1726882190.25853: checking for max_fail_percentage 11579 1726882190.25855: done checking for max_fail_percentage 11579 1726882190.25855: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.25856: done checking to see if all hosts have failed 11579 1726882190.25857: getting the remaining hosts for this loop 11579 1726882190.25859: done getting the remaining hosts for this loop 11579 1726882190.25862: getting the next task for host managed_node1 11579 1726882190.25867: done getting 
next task for host managed_node1 11579 1726882190.25870: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11579 1726882190.25872: ^ state is: HOST STATE: block=2, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882190.25875: getting variables 11579 1726882190.25877: in VariableManager get_vars() 11579 1726882190.25918: Calling all_inventory to load vars for managed_node1 11579 1726882190.25922: Calling groups_inventory to load vars for managed_node1 11579 1726882190.25924: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.25934: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.25937: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.25940: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.27285: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.28915: done with get_vars() 11579 1726882190.28934: done getting variables 11579 1726882190.28997: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882190.29106: variable 'profile' from source: include params 11579 1726882190.29109: variable 'item' from 
source: include params 11579 1726882190.29162: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0] ***************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:50 -0400 (0:00:00.057) 0:00:19.000 ****** 11579 1726882190.29197: entering _queue_task() for managed_node1/assert 11579 1726882190.29479: worker is 1 (out of 1 available) 11579 1726882190.29491: exiting _queue_task() for managed_node1/assert 11579 1726882190.29607: done queuing things up, now waiting for results queue to drain 11579 1726882190.29609: waiting for pending results... 11579 1726882190.29889: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 11579 1726882190.29898: in run() - task 12673a56-9f93-f197-7423-000000000262 11579 1726882190.29904: variable 'ansible_search_path' from source: unknown 11579 1726882190.29907: variable 'ansible_search_path' from source: unknown 11579 1726882190.29929: calling self._execute() 11579 1726882190.30015: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.30018: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.30037: variable 'omit' from source: magic vars 11579 1726882190.30382: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.30395: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.30404: variable 'omit' from source: magic vars 11579 1726882190.30442: variable 'omit' from source: magic vars 11579 1726882190.30543: variable 'profile' from source: include params 11579 1726882190.30546: variable 'item' from source: include params 11579 1726882190.30613: variable 'item' from source: include params 11579 1726882190.30631: variable 'omit' from source: magic vars 11579 1726882190.30670: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882190.30712: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882190.30732: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882190.30749: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.30759: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.30791: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882190.30796: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.30801: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.30905: Set connection var ansible_timeout to 10 11579 1726882190.30912: Set connection var ansible_shell_type to sh 11579 1726882190.30920: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882190.30925: Set connection var ansible_shell_executable to /bin/sh 11579 1726882190.30932: Set connection var ansible_pipelining to False 11579 1726882190.30935: Set connection var ansible_connection to ssh 11579 1726882190.30954: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.30957: variable 'ansible_connection' from source: unknown 11579 1726882190.30959: variable 'ansible_module_compression' from source: unknown 11579 1726882190.30961: variable 'ansible_shell_type' from source: unknown 11579 1726882190.30963: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.30965: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.31086: variable 'ansible_pipelining' from source: unknown 11579 1726882190.31090: variable 'ansible_timeout' from 
source: unknown 11579 1726882190.31092: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.31104: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882190.31114: variable 'omit' from source: magic vars 11579 1726882190.31119: starting attempt loop 11579 1726882190.31122: running the handler 11579 1726882190.31229: variable 'lsr_net_profile_fingerprint' from source: set_fact 11579 1726882190.31233: Evaluated conditional (lsr_net_profile_fingerprint): True 11579 1726882190.31240: handler run complete 11579 1726882190.31253: attempt loop complete, returning result 11579 1726882190.31256: _execute() done 11579 1726882190.31258: dumping result to json 11579 1726882190.31261: done dumping result, returning 11579 1726882190.31270: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0 [12673a56-9f93-f197-7423-000000000262] 11579 1726882190.31275: sending task result for task 12673a56-9f93-f197-7423-000000000262 11579 1726882190.31443: done sending task result for task 12673a56-9f93-f197-7423-000000000262 11579 1726882190.31447: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882190.31492: no more pending results, returning what we have 11579 1726882190.31498: results queue empty 11579 1726882190.31499: checking for any_errors_fatal 11579 1726882190.31505: done checking for any_errors_fatal 11579 1726882190.31505: checking for max_fail_percentage 11579 1726882190.31507: done checking for max_fail_percentage 11579 1726882190.31508: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.31509: done checking to see if all hosts have 
failed 11579 1726882190.31509: getting the remaining hosts for this loop 11579 1726882190.31511: done getting the remaining hosts for this loop 11579 1726882190.31514: getting the next task for host managed_node1 11579 1726882190.31524: done getting next task for host managed_node1 11579 1726882190.31526: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11579 1726882190.31529: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882190.31533: getting variables 11579 1726882190.31534: in VariableManager get_vars() 11579 1726882190.31572: Calling all_inventory to load vars for managed_node1 11579 1726882190.31575: Calling groups_inventory to load vars for managed_node1 11579 1726882190.31578: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.31588: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.31591: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.31598: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.32940: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.34487: done with get_vars() 11579 1726882190.34511: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 
Friday 20 September 2024 21:29:50 -0400 (0:00:00.054) 0:00:19.054 ****** 11579 1726882190.34605: entering _queue_task() for managed_node1/include_tasks 11579 1726882190.34877: worker is 1 (out of 1 available) 11579 1726882190.34890: exiting _queue_task() for managed_node1/include_tasks 11579 1726882190.35007: done queuing things up, now waiting for results queue to drain 11579 1726882190.35009: waiting for pending results... 11579 1726882190.35206: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11579 1726882190.35302: in run() - task 12673a56-9f93-f197-7423-000000000266 11579 1726882190.35306: variable 'ansible_search_path' from source: unknown 11579 1726882190.35309: variable 'ansible_search_path' from source: unknown 11579 1726882190.35319: calling self._execute() 11579 1726882190.35409: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.35413: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.35425: variable 'omit' from source: magic vars 11579 1726882190.35772: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.35790: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.35800: _execute() done 11579 1726882190.35803: dumping result to json 11579 1726882190.35808: done dumping result, returning 11579 1726882190.35815: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-f197-7423-000000000266] 11579 1726882190.35820: sending task result for task 12673a56-9f93-f197-7423-000000000266 11579 1726882190.35906: done sending task result for task 12673a56-9f93-f197-7423-000000000266 11579 1726882190.35910: WORKER PROCESS EXITING 11579 1726882190.35937: no more pending results, returning what we have 11579 1726882190.35942: in VariableManager get_vars() 11579 1726882190.35990: Calling all_inventory to load vars for managed_node1 11579 
1726882190.35997: Calling groups_inventory to load vars for managed_node1 11579 1726882190.36000: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.36013: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.36015: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.36018: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.37573: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.39125: done with get_vars() 11579 1726882190.39145: variable 'ansible_search_path' from source: unknown 11579 1726882190.39146: variable 'ansible_search_path' from source: unknown 11579 1726882190.39181: we have included files to process 11579 1726882190.39183: generating all_blocks data 11579 1726882190.39184: done generating all_blocks data 11579 1726882190.39189: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882190.39190: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882190.39196: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882190.40496: done processing included file 11579 1726882190.40600: iterating over new_blocks loaded from include file 11579 1726882190.40602: in VariableManager get_vars() 11579 1726882190.40622: done with get_vars() 11579 1726882190.40624: filtering new block on tags 11579 1726882190.40648: done filtering new block on tags 11579 1726882190.40651: in VariableManager get_vars() 11579 1726882190.40668: done with get_vars() 11579 1726882190.40669: filtering new block on tags 11579 1726882190.40688: done filtering new block on tags 11579 1726882190.40690: done iterating over new_blocks 
loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11579 1726882190.40801: extending task lists for all hosts with included blocks 11579 1726882190.41068: done extending task lists 11579 1726882190.41069: done processing included files 11579 1726882190.41070: results queue empty 11579 1726882190.41071: checking for any_errors_fatal 11579 1726882190.41074: done checking for any_errors_fatal 11579 1726882190.41074: checking for max_fail_percentage 11579 1726882190.41075: done checking for max_fail_percentage 11579 1726882190.41076: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.41077: done checking to see if all hosts have failed 11579 1726882190.41077: getting the remaining hosts for this loop 11579 1726882190.41078: done getting the remaining hosts for this loop 11579 1726882190.41081: getting the next task for host managed_node1 11579 1726882190.41084: done getting next task for host managed_node1 11579 1726882190.41086: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882190.41089: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11579 1726882190.41091: getting variables 11579 1726882190.41092: in VariableManager get_vars() 11579 1726882190.41107: Calling all_inventory to load vars for managed_node1 11579 1726882190.41110: Calling groups_inventory to load vars for managed_node1 11579 1726882190.41112: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.41117: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.41119: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.41121: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.42617: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.44718: done with get_vars() 11579 1726882190.44740: done getting variables 11579 1726882190.44783: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:50 -0400 (0:00:00.102) 0:00:19.156 ****** 11579 1726882190.44818: entering _queue_task() for managed_node1/set_fact 11579 1726882190.45162: worker is 1 (out of 1 available) 11579 1726882190.45174: exiting _queue_task() for managed_node1/set_fact 11579 1726882190.45186: done queuing things up, now waiting for results queue to drain 11579 1726882190.45187: waiting for pending results... 
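The set_fact task queued next (get_profile_stat.yml:3) initializes the three flags that the assert tasks earlier in this log tested; its result dump shows all three reset to false. A hedged sketch of that task, using only the fact names and values visible in the log:

```yaml
# Hypothetical reconstruction of get_profile_stat.yml:3. The fact names
# and their initial "false" values come from the task result in this log.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```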
11579 1726882190.45612: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882190.45626: in run() - task 12673a56-9f93-f197-7423-0000000003f8 11579 1726882190.45630: variable 'ansible_search_path' from source: unknown 11579 1726882190.45633: variable 'ansible_search_path' from source: unknown 11579 1726882190.45635: calling self._execute() 11579 1726882190.45716: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.45720: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.45738: variable 'omit' from source: magic vars 11579 1726882190.46099: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.46113: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.46119: variable 'omit' from source: magic vars 11579 1726882190.46383: variable 'omit' from source: magic vars 11579 1726882190.46387: variable 'omit' from source: magic vars 11579 1726882190.46389: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882190.46392: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882190.46396: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882190.46398: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.46401: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.46403: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882190.46405: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.46407: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11579 1726882190.46473: Set connection var ansible_timeout to 10 11579 1726882190.46478: Set connection var ansible_shell_type to sh 11579 1726882190.46488: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882190.46490: Set connection var ansible_shell_executable to /bin/sh 11579 1726882190.46607: Set connection var ansible_pipelining to False 11579 1726882190.46611: Set connection var ansible_connection to ssh 11579 1726882190.46614: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.46617: variable 'ansible_connection' from source: unknown 11579 1726882190.46620: variable 'ansible_module_compression' from source: unknown 11579 1726882190.46622: variable 'ansible_shell_type' from source: unknown 11579 1726882190.46624: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.46626: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.46628: variable 'ansible_pipelining' from source: unknown 11579 1726882190.46631: variable 'ansible_timeout' from source: unknown 11579 1726882190.46633: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.46716: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882190.46721: variable 'omit' from source: magic vars 11579 1726882190.46723: starting attempt loop 11579 1726882190.46726: running the handler 11579 1726882190.46728: handler run complete 11579 1726882190.46730: attempt loop complete, returning result 11579 1726882190.46733: _execute() done 11579 1726882190.46735: dumping result to json 11579 1726882190.46737: done dumping result, returning 11579 1726882190.46746: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-f197-7423-0000000003f8] 11579 1726882190.46748: sending task result for task 12673a56-9f93-f197-7423-0000000003f8 11579 1726882190.46963: done sending task result for task 12673a56-9f93-f197-7423-0000000003f8 11579 1726882190.46967: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11579 1726882190.47019: no more pending results, returning what we have 11579 1726882190.47021: results queue empty 11579 1726882190.47022: checking for any_errors_fatal 11579 1726882190.47024: done checking for any_errors_fatal 11579 1726882190.47025: checking for max_fail_percentage 11579 1726882190.47026: done checking for max_fail_percentage 11579 1726882190.47027: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.47028: done checking to see if all hosts have failed 11579 1726882190.47029: getting the remaining hosts for this loop 11579 1726882190.47031: done getting the remaining hosts for this loop 11579 1726882190.47034: getting the next task for host managed_node1 11579 1726882190.47040: done getting next task for host managed_node1 11579 1726882190.47043: ^ task is: TASK: Stat profile file 11579 1726882190.47046: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882190.47049: getting variables 11579 1726882190.47051: in VariableManager get_vars() 11579 1726882190.47092: Calling all_inventory to load vars for managed_node1 11579 1726882190.47099: Calling groups_inventory to load vars for managed_node1 11579 1726882190.47102: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.47113: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.47116: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.47118: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.48585: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.50165: done with get_vars() 11579 1726882190.50191: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:50 -0400 (0:00:00.054) 0:00:19.211 ****** 11579 1726882190.50290: entering _queue_task() for managed_node1/stat 11579 1726882190.50643: worker is 1 (out of 1 available) 11579 1726882190.50657: exiting _queue_task() for managed_node1/stat 11579 1726882190.50668: done queuing things up, now waiting for results queue to drain 11579 1726882190.50669: waiting for pending results... 
11579 1726882190.51111: running TaskExecutor() for managed_node1/TASK: Stat profile file 11579 1726882190.51121: in run() - task 12673a56-9f93-f197-7423-0000000003f9 11579 1726882190.51124: variable 'ansible_search_path' from source: unknown 11579 1726882190.51127: variable 'ansible_search_path' from source: unknown 11579 1726882190.51129: calling self._execute() 11579 1726882190.51402: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.51407: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.51410: variable 'omit' from source: magic vars 11579 1726882190.51604: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.51618: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.51625: variable 'omit' from source: magic vars 11579 1726882190.51674: variable 'omit' from source: magic vars 11579 1726882190.51765: variable 'profile' from source: include params 11579 1726882190.51769: variable 'item' from source: include params 11579 1726882190.52042: variable 'item' from source: include params 11579 1726882190.52060: variable 'omit' from source: magic vars 11579 1726882190.52107: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882190.52143: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882190.52164: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882190.52181: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.52194: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882190.52423: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 
1726882190.52426: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.52430: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.52552: Set connection var ansible_timeout to 10 11579 1726882190.52558: Set connection var ansible_shell_type to sh 11579 1726882190.52566: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882190.52571: Set connection var ansible_shell_executable to /bin/sh 11579 1726882190.52579: Set connection var ansible_pipelining to False 11579 1726882190.52582: Set connection var ansible_connection to ssh 11579 1726882190.52657: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.52661: variable 'ansible_connection' from source: unknown 11579 1726882190.52664: variable 'ansible_module_compression' from source: unknown 11579 1726882190.52666: variable 'ansible_shell_type' from source: unknown 11579 1726882190.52669: variable 'ansible_shell_executable' from source: unknown 11579 1726882190.52671: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.52700: variable 'ansible_pipelining' from source: unknown 11579 1726882190.52704: variable 'ansible_timeout' from source: unknown 11579 1726882190.52707: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.53099: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882190.53172: variable 'omit' from source: magic vars 11579 1726882190.53178: starting attempt loop 11579 1726882190.53181: running the handler 11579 1726882190.53202: _low_level_execute_command(): starting 11579 1726882190.53209: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882190.54840: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882190.54894: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.56490: stdout chunk (state=3): >>>/root <<< 11579 1726882190.56750: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882190.56755: stdout chunk (state=3): >>><<< 11579 1726882190.56757: stderr chunk (state=3): >>><<< 11579 1726882190.56759: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882190.56761: _low_level_execute_command(): starting 11579 1726882190.56764: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541 `" && echo ansible-tmp-1726882190.5665276-12468-21626528003541="` echo /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541 `" ) && sleep 0' 11579 1726882190.57152: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882190.57165: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882190.57177: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.57280: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882190.57283: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882190.57344: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.59255: stdout chunk (state=3): >>>ansible-tmp-1726882190.5665276-12468-21626528003541=/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541 <<< 11579 1726882190.59380: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882190.59384: stdout chunk (state=3): >>><<< 11579 1726882190.59386: stderr chunk (state=3): >>><<< 11579 1726882190.59405: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882190.5665276-12468-21626528003541=/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882190.59439: variable 'ansible_module_compression' from source: unknown 11579 1726882190.59481: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882190.59516: variable 'ansible_facts' from source: unknown 11579 1726882190.59575: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py 11579 1726882190.59671: Sending initial data 11579 1726882190.59675: Sent initial data (152 bytes) 11579 1726882190.60264: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 
11579 1726882190.60279: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882190.60376: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.61884: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882190.61951: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882190.62012: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp7m1m9mhf /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py <<< 11579 1726882190.62015: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py" <<< 11579 1726882190.62050: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp7m1m9mhf" to remote "/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py" <<< 11579 1726882190.62962: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882190.63070: stderr chunk (state=3): >>><<< 11579 1726882190.63073: stdout chunk (state=3): >>><<< 11579 1726882190.63075: done transferring module to remote 11579 1726882190.63077: _low_level_execute_command(): starting 11579 1726882190.63079: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/ /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py && sleep 0' 11579 1726882190.64157: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882190.64161: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882190.64178: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882190.64184: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.64267: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882190.64276: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882190.64279: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.64509: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.66118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882190.66168: stderr chunk (state=3): >>><<< 11579 1726882190.66185: stdout chunk (state=3): >>><<< 11579 1726882190.66210: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882190.66223: _low_level_execute_command(): starting 11579 1726882190.66242: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/AnsiballZ_stat.py && sleep 0' 11579 1726882190.66865: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882190.66878: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882190.66895: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882190.66914: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882190.66931: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882190.66952: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882190.66989: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.67086: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882190.67109: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882190.67127: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882190.67214: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.82069: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882190.83264: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882190.83278: stderr chunk (state=3): >>><<< 11579 1726882190.83292: stdout chunk (state=3): >>><<< 11579 1726882190.83426: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.0", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882190.83431: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.0', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882190.83433: _low_level_execute_command(): starting 11579 1726882190.83435: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882190.5665276-12468-21626528003541/ > /dev/null 2>&1 && sleep 0' 11579 1726882190.83833: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882190.83841: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
<<< 11579 1726882190.83848: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882190.83854: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.83872: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882190.83878: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882190.83889: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882190.83942: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882190.83945: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882190.83992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882190.85766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882190.85788: stderr chunk (state=3): >>><<< 11579 1726882190.85792: stdout chunk (state=3): >>><<< 11579 1726882190.85811: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882190.85818: handler run complete 11579 1726882190.85831: attempt loop complete, returning result 11579 1726882190.85834: _execute() done 11579 1726882190.85836: dumping result to json 11579 1726882190.85841: done dumping result, returning 11579 1726882190.85848: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-f197-7423-0000000003f9] 11579 1726882190.85852: sending task result for task 12673a56-9f93-f197-7423-0000000003f9 11579 1726882190.85939: done sending task result for task 12673a56-9f93-f197-7423-0000000003f9 11579 1726882190.85942: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "stat": { "exists": false } } 11579 1726882190.86000: no more pending results, returning what we have 11579 1726882190.86004: results queue empty 11579 1726882190.86005: checking for any_errors_fatal 11579 1726882190.86011: done checking for any_errors_fatal 11579 1726882190.86012: checking for max_fail_percentage 11579 1726882190.86014: done checking for max_fail_percentage 11579 1726882190.86014: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.86015: done checking to see if all 
hosts have failed 11579 1726882190.86016: getting the remaining hosts for this loop 11579 1726882190.86018: done getting the remaining hosts for this loop 11579 1726882190.86022: getting the next task for host managed_node1 11579 1726882190.86029: done getting next task for host managed_node1 11579 1726882190.86031: ^ task is: TASK: Set NM profile exist flag based on the profile files 11579 1726882190.86035: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882190.86038: getting variables 11579 1726882190.86040: in VariableManager get_vars() 11579 1726882190.86085: Calling all_inventory to load vars for managed_node1 11579 1726882190.86087: Calling groups_inventory to load vars for managed_node1 11579 1726882190.86090: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.86104: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.86107: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.86110: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.86984: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.94688: done with get_vars() 11579 1726882190.94711: done getting variables 11579 1726882190.94759: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag based on the profile files] ******************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17 Friday 20 September 2024 21:29:50 -0400 (0:00:00.444) 0:00:19.656 ****** 11579 1726882190.94791: entering _queue_task() for managed_node1/set_fact 11579 1726882190.95249: worker is 1 (out of 1 available) 11579 1726882190.95262: exiting _queue_task() for managed_node1/set_fact 11579 1726882190.95273: done queuing things up, now waiting for results queue to drain 11579 1726882190.95275: waiting for pending results... 
11579 1726882190.95615: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files 11579 1726882190.95627: in run() - task 12673a56-9f93-f197-7423-0000000003fa 11579 1726882190.95631: variable 'ansible_search_path' from source: unknown 11579 1726882190.95635: variable 'ansible_search_path' from source: unknown 11579 1726882190.95675: calling self._execute() 11579 1726882190.95787: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882190.95796: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882190.95807: variable 'omit' from source: magic vars 11579 1726882190.96224: variable 'ansible_distribution_major_version' from source: facts 11579 1726882190.96254: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882190.96364: variable 'profile_stat' from source: set_fact 11579 1726882190.96375: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882190.96378: when evaluation is False, skipping this task 11579 1726882190.96381: _execute() done 11579 1726882190.96384: dumping result to json 11579 1726882190.96398: done dumping result, returning 11579 1726882190.96405: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-f197-7423-0000000003fa] 11579 1726882190.96410: sending task result for task 12673a56-9f93-f197-7423-0000000003fa 11579 1726882190.96584: done sending task result for task 12673a56-9f93-f197-7423-0000000003fa 11579 1726882190.96587: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882190.96753: no more pending results, returning what we have 11579 1726882190.96756: results queue empty 11579 1726882190.96757: checking for any_errors_fatal 11579 1726882190.96764: done checking for any_errors_fatal 11579 1726882190.96765: 
checking for max_fail_percentage 11579 1726882190.96767: done checking for max_fail_percentage 11579 1726882190.96768: checking to see if all hosts have failed and the running result is not ok 11579 1726882190.96770: done checking to see if all hosts have failed 11579 1726882190.96770: getting the remaining hosts for this loop 11579 1726882190.96772: done getting the remaining hosts for this loop 11579 1726882190.96775: getting the next task for host managed_node1 11579 1726882190.96781: done getting next task for host managed_node1 11579 1726882190.96784: ^ task is: TASK: Get NM profile info 11579 1726882190.96787: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882190.96791: getting variables 11579 1726882190.96794: in VariableManager get_vars() 11579 1726882190.96835: Calling all_inventory to load vars for managed_node1 11579 1726882190.96838: Calling groups_inventory to load vars for managed_node1 11579 1726882190.96841: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882190.96852: Calling all_plugins_play to load vars for managed_node1 11579 1726882190.96855: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882190.96941: Calling groups_plugins_play to load vars for managed_node1 11579 1726882190.98229: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882190.99859: done with get_vars() 11579 1726882190.99880: done getting variables 11579 1726882190.99942: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Get NM profile info] ***************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25 Friday 20 September 2024 21:29:50 -0400 (0:00:00.051) 0:00:19.708 ****** 11579 1726882190.99974: entering _queue_task() for managed_node1/shell 11579 1726882191.00282: worker is 1 (out of 1 available) 11579 1726882191.00502: exiting _queue_task() for managed_node1/shell 11579 1726882191.00513: done queuing things up, now waiting for results queue to drain 11579 1726882191.00515: waiting for pending results... 
11579 1726882191.00753: running TaskExecutor() for managed_node1/TASK: Get NM profile info 11579 1726882191.00757: in run() - task 12673a56-9f93-f197-7423-0000000003fb 11579 1726882191.00760: variable 'ansible_search_path' from source: unknown 11579 1726882191.00762: variable 'ansible_search_path' from source: unknown 11579 1726882191.00765: calling self._execute() 11579 1726882191.00873: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.00879: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.00890: variable 'omit' from source: magic vars 11579 1726882191.01308: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.01319: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.01325: variable 'omit' from source: magic vars 11579 1726882191.01379: variable 'omit' from source: magic vars 11579 1726882191.01486: variable 'profile' from source: include params 11579 1726882191.01491: variable 'item' from source: include params 11579 1726882191.01560: variable 'item' from source: include params 11579 1726882191.01579: variable 'omit' from source: magic vars 11579 1726882191.01632: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.01669: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.01689: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.01716: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.01729: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.01763: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 
1726882191.01766: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.01768: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.02300: Set connection var ansible_timeout to 10 11579 1726882191.02304: Set connection var ansible_shell_type to sh 11579 1726882191.02306: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.02308: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.02311: Set connection var ansible_pipelining to False 11579 1726882191.02313: Set connection var ansible_connection to ssh 11579 1726882191.02315: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.02317: variable 'ansible_connection' from source: unknown 11579 1726882191.02319: variable 'ansible_module_compression' from source: unknown 11579 1726882191.02321: variable 'ansible_shell_type' from source: unknown 11579 1726882191.02323: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.02325: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.02327: variable 'ansible_pipelining' from source: unknown 11579 1726882191.02330: variable 'ansible_timeout' from source: unknown 11579 1726882191.02332: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.02335: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.02337: variable 'omit' from source: magic vars 11579 1726882191.02339: starting attempt loop 11579 1726882191.02341: running the handler 11579 1726882191.02344: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.02346: _low_level_execute_command(): starting 11579 1726882191.02348: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882191.02920: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.02987: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.03014: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.03104: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.04691: stdout chunk (state=3): >>>/root <<< 11579 1726882191.04807: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882191.04848: stderr chunk (state=3): >>><<< 11579 1726882191.04867: stdout chunk (state=3): >>><<< 11579 1726882191.04897: 
_low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882191.04918: _low_level_execute_command(): starting 11579 1726882191.04929: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009 `" && echo ansible-tmp-1726882191.0490558-12499-43028859077009="` echo /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009 `" ) && sleep 0' 11579 1726882191.05544: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882191.05559: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882191.05581: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 
1726882191.05605: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882191.05625: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882191.05637: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882191.05710: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.05756: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882191.05775: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.05801: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.05881: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.07901: stdout chunk (state=3): >>>ansible-tmp-1726882191.0490558-12499-43028859077009=/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009 <<< 11579 1726882191.07910: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882191.07913: stdout chunk (state=3): >>><<< 11579 1726882191.07921: stderr chunk (state=3): >>><<< 11579 1726882191.07943: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882191.0490558-12499-43028859077009=/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009 , 
stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882191.07976: variable 'ansible_module_compression' from source: unknown 11579 1726882191.08028: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882191.08099: variable 'ansible_facts' from source: unknown 11579 1726882191.08155: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py 11579 1726882191.08340: Sending initial data 11579 1726882191.08344: Sent initial data (155 bytes) 11579 1726882191.08997: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882191.09100: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.09104: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882191.09123: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.09134: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.09211: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.10709: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882191.10713: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11579 1726882191.10735: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11579 1726882191.10738: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports 
extension "copy-data" revision 1 <<< 11579 1726882191.10741: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882191.10773: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882191.10819: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpcknvuuw2 /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py <<< 11579 1726882191.10825: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py" <<< 11579 1726882191.10859: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpcknvuuw2" to remote "/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py" <<< 11579 1726882191.11400: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882191.11433: stderr chunk (state=3): >>><<< 11579 1726882191.11436: stdout chunk (state=3): >>><<< 11579 1726882191.11453: done transferring module to remote 11579 1726882191.11460: _low_level_execute_command(): starting 11579 1726882191.11465: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/ /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py && sleep 0' 11579 1726882191.12036: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.12063: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.12132: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.13855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882191.13862: stdout chunk (state=3): >>><<< 11579 1726882191.13869: stderr chunk (state=3): >>><<< 11579 1726882191.13899: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882191.13902: _low_level_execute_command(): starting 11579 1726882191.13905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/AnsiballZ_command.py && sleep 0' 11579 1726882191.14270: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882191.14306: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882191.14310: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882191.14312: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.14314: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882191.14316: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882191.14318: stderr chunk (state=3): >>>debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.14365: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.14386: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.14425: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.31302: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:51.291185", "end": "2024-09-20 21:29:51.311427", "delta": "0:00:00.020242", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882191.33003: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882191.33007: stderr chunk (state=3): >>><<< 11579 1726882191.33010: stdout chunk (state=3): >>><<< 11579 1726882191.33012: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "start": "2024-09-20 21:29:51.291185", "end": "2024-09-20 21:29:51.311427", "delta": "0:00:00.020242", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared 
connection to 10.31.9.159 closed. 11579 1726882191.33015: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882191.33018: _low_level_execute_command(): starting 11579 1726882191.33020: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882191.0490558-12499-43028859077009/ > /dev/null 2>&1 && sleep 0' 11579 1726882191.33542: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882191.33550: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882191.33561: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882191.33574: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882191.33586: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882191.33599: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882191.33607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.33622: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass <<< 11579 1726882191.33630: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882191.33643: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882191.33651: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882191.33661: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882191.33749: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882191.33765: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.33777: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882191.33842: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882191.35623: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882191.35678: stderr chunk (state=3): >>><<< 11579 1726882191.35692: stdout chunk (state=3): >>><<< 11579 1726882191.35719: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882191.35731: handler run complete 11579 1726882191.35784: Evaluated conditional (False): False 11579 1726882191.35787: attempt loop complete, returning result 11579 1726882191.35789: _execute() done 11579 1726882191.35791: dumping result to json 11579 1726882191.35796: done dumping result, returning 11579 1726882191.35890: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-f197-7423-0000000003fb] 11579 1726882191.35896: sending task result for task 12673a56-9f93-f197-7423-0000000003fb 11579 1726882191.35965: done sending task result for task 12673a56-9f93-f197-7423-0000000003fb 11579 1726882191.35967: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.0 | grep /etc", "delta": "0:00:00.020242", "end": "2024-09-20 21:29:51.311427", "rc": 0, "start": "2024-09-20 21:29:51.291185" } STDOUT: bond0.0 /etc/NetworkManager/system-connections/bond0.0.nmconnection 11579 1726882191.36157: no more pending results, returning what we have 11579 1726882191.36160: results queue empty 11579 1726882191.36161: checking for any_errors_fatal 11579 1726882191.36165: done checking for any_errors_fatal 11579 1726882191.36166: checking for max_fail_percentage 11579 1726882191.36167: done checking for max_fail_percentage 11579 1726882191.36168: checking to see if all hosts 
have failed and the running result is not ok 11579 1726882191.36169: done checking to see if all hosts have failed 11579 1726882191.36170: getting the remaining hosts for this loop 11579 1726882191.36171: done getting the remaining hosts for this loop 11579 1726882191.36174: getting the next task for host managed_node1 11579 1726882191.36181: done getting next task for host managed_node1 11579 1726882191.36183: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882191.36186: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882191.36189: getting variables 11579 1726882191.36191: in VariableManager get_vars() 11579 1726882191.36235: Calling all_inventory to load vars for managed_node1 11579 1726882191.36238: Calling groups_inventory to load vars for managed_node1 11579 1726882191.36240: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.36249: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.36252: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.36254: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.37878: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.40097: done with get_vars() 11579 1726882191.40119: done getting variables 11579 1726882191.40180: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:51 -0400 (0:00:00.402) 0:00:20.110 ****** 11579 1726882191.40217: entering _queue_task() for managed_node1/set_fact 11579 1726882191.40631: worker is 1 (out of 1 available) 11579 1726882191.40641: exiting _queue_task() for managed_node1/set_fact 11579 1726882191.40651: done queuing things up, now waiting for results queue to drain 11579 1726882191.40652: waiting for pending results... 
11579 1726882191.41013: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882191.41018: in run() - task 12673a56-9f93-f197-7423-0000000003fc 11579 1726882191.41021: variable 'ansible_search_path' from source: unknown 11579 1726882191.41024: variable 'ansible_search_path' from source: unknown 11579 1726882191.41053: calling self._execute() 11579 1726882191.41166: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.41177: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.41192: variable 'omit' from source: magic vars 11579 1726882191.41638: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.41661: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.41803: variable 'nm_profile_exists' from source: set_fact 11579 1726882191.41822: Evaluated conditional (nm_profile_exists.rc == 0): True 11579 1726882191.41833: variable 'omit' from source: magic vars 11579 1726882191.41888: variable 'omit' from source: magic vars 11579 1726882191.41929: variable 'omit' from source: magic vars 11579 1726882191.41969: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.42016: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.42043: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.42091: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.42099: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.42123: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11579 1726882191.42132: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.42201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.42242: Set connection var ansible_timeout to 10 11579 1726882191.42251: Set connection var ansible_shell_type to sh 11579 1726882191.42260: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.42266: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.42274: Set connection var ansible_pipelining to False 11579 1726882191.42279: Set connection var ansible_connection to ssh 11579 1726882191.42310: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.42319: variable 'ansible_connection' from source: unknown 11579 1726882191.42326: variable 'ansible_module_compression' from source: unknown 11579 1726882191.42333: variable 'ansible_shell_type' from source: unknown 11579 1726882191.42340: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.42347: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.42355: variable 'ansible_pipelining' from source: unknown 11579 1726882191.42363: variable 'ansible_timeout' from source: unknown 11579 1726882191.42371: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.42527: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.42535: variable 'omit' from source: magic vars 11579 1726882191.42635: starting attempt loop 11579 1726882191.42639: running the handler 11579 1726882191.42642: handler run complete 11579 1726882191.42644: attempt loop complete, returning result 11579 1726882191.42646: _execute() done 
11579 1726882191.42648: dumping result to json 11579 1726882191.42650: done dumping result, returning 11579 1726882191.42653: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-f197-7423-0000000003fc] 11579 1726882191.42655: sending task result for task 12673a56-9f93-f197-7423-0000000003fc 11579 1726882191.42728: done sending task result for task 12673a56-9f93-f197-7423-0000000003fc 11579 1726882191.42731: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11579 1726882191.42784: no more pending results, returning what we have 11579 1726882191.42787: results queue empty 11579 1726882191.42788: checking for any_errors_fatal 11579 1726882191.42798: done checking for any_errors_fatal 11579 1726882191.42798: checking for max_fail_percentage 11579 1726882191.42800: done checking for max_fail_percentage 11579 1726882191.42801: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.42802: done checking to see if all hosts have failed 11579 1726882191.42803: getting the remaining hosts for this loop 11579 1726882191.42804: done getting the remaining hosts for this loop 11579 1726882191.42807: getting the next task for host managed_node1 11579 1726882191.42816: done getting next task for host managed_node1 11579 1726882191.42818: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882191.42823: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.42826: getting variables 11579 1726882191.42828: in VariableManager get_vars() 11579 1726882191.42868: Calling all_inventory to load vars for managed_node1 11579 1726882191.42872: Calling groups_inventory to load vars for managed_node1 11579 1726882191.42874: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.42887: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.42890: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.43001: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.44387: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.45965: done with get_vars() 11579 1726882191.45984: done getting variables 11579 1726882191.46042: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.46155: variable 'profile' from source: include params 11579 1726882191.46159: variable 'item' from source: include params 11579 1726882191.46219: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.0] ************************ task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:51 -0400 (0:00:00.060) 0:00:20.170 ****** 11579 1726882191.46253: entering _queue_task() for managed_node1/command 11579 1726882191.46515: worker is 1 (out of 1 available) 11579 1726882191.46527: exiting _queue_task() for managed_node1/command 11579 1726882191.46537: done queuing things up, now waiting for results queue to drain 11579 1726882191.46539: waiting for pending results... 11579 1726882191.46815: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 11579 1726882191.46926: in run() - task 12673a56-9f93-f197-7423-0000000003fe 11579 1726882191.46948: variable 'ansible_search_path' from source: unknown 11579 1726882191.46955: variable 'ansible_search_path' from source: unknown 11579 1726882191.47000: calling self._execute() 11579 1726882191.47096: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.47126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.47129: variable 'omit' from source: magic vars 11579 1726882191.47560: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.47564: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.47619: variable 'profile_stat' from source: set_fact 11579 1726882191.47637: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882191.47643: when evaluation is False, skipping this task 11579 1726882191.47650: _execute() done 11579 1726882191.47657: dumping result to json 11579 1726882191.47667: done dumping result, returning 11579 1726882191.47677: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-f197-7423-0000000003fe] 11579 1726882191.47686: sending task result for task 12673a56-9f93-f197-7423-0000000003fe 11579 
1726882191.47906: done sending task result for task 12673a56-9f93-f197-7423-0000000003fe 11579 1726882191.47908: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882191.47954: no more pending results, returning what we have 11579 1726882191.47957: results queue empty 11579 1726882191.47957: checking for any_errors_fatal 11579 1726882191.47965: done checking for any_errors_fatal 11579 1726882191.47965: checking for max_fail_percentage 11579 1726882191.47967: done checking for max_fail_percentage 11579 1726882191.47968: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.47969: done checking to see if all hosts have failed 11579 1726882191.47970: getting the remaining hosts for this loop 11579 1726882191.47971: done getting the remaining hosts for this loop 11579 1726882191.47974: getting the next task for host managed_node1 11579 1726882191.47981: done getting next task for host managed_node1 11579 1726882191.47983: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882191.47986: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.47990: getting variables 11579 1726882191.47992: in VariableManager get_vars() 11579 1726882191.48035: Calling all_inventory to load vars for managed_node1 11579 1726882191.48038: Calling groups_inventory to load vars for managed_node1 11579 1726882191.48041: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.48052: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.48055: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.48057: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.49548: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.51484: done with get_vars() 11579 1726882191.51520: done getting variables 11579 1726882191.51581: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.51708: variable 'profile' from source: include params 11579 1726882191.51712: variable 'item' from source: include params 11579 1726882191.51778: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.0] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:51 -0400 (0:00:00.055) 0:00:20.226 ****** 11579 1726882191.51813: entering _queue_task() for managed_node1/set_fact 11579 1726882191.52162: worker is 1 (out of 1 available) 11579 1726882191.52173: exiting _queue_task() for managed_node1/set_fact 11579 1726882191.52184: done queuing things up, now waiting for results queue 
to drain 11579 1726882191.52185: waiting for pending results... 11579 1726882191.52913: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 11579 1726882191.52921: in run() - task 12673a56-9f93-f197-7423-0000000003ff 11579 1726882191.52925: variable 'ansible_search_path' from source: unknown 11579 1726882191.52932: variable 'ansible_search_path' from source: unknown 11579 1726882191.53300: calling self._execute() 11579 1726882191.53304: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.53307: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.53309: variable 'omit' from source: magic vars 11579 1726882191.53935: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.54089: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.54341: variable 'profile_stat' from source: set_fact 11579 1726882191.54455: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882191.54465: when evaluation is False, skipping this task 11579 1726882191.54472: _execute() done 11579 1726882191.54479: dumping result to json 11579 1726882191.54486: done dumping result, returning 11579 1726882191.54501: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.0 [12673a56-9f93-f197-7423-0000000003ff] 11579 1726882191.54512: sending task result for task 12673a56-9f93-f197-7423-0000000003ff skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882191.54672: no more pending results, returning what we have 11579 1726882191.54675: results queue empty 11579 1726882191.54676: checking for any_errors_fatal 11579 1726882191.54682: done checking for any_errors_fatal 11579 1726882191.54683: checking for max_fail_percentage 11579 1726882191.54685: done checking for 
max_fail_percentage 11579 1726882191.54686: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.54687: done checking to see if all hosts have failed 11579 1726882191.54687: getting the remaining hosts for this loop 11579 1726882191.54689: done getting the remaining hosts for this loop 11579 1726882191.54692: getting the next task for host managed_node1 11579 1726882191.54702: done getting next task for host managed_node1 11579 1726882191.54704: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11579 1726882191.54708: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882191.54712: getting variables 11579 1726882191.54714: in VariableManager get_vars() 11579 1726882191.54756: Calling all_inventory to load vars for managed_node1 11579 1726882191.54759: Calling groups_inventory to load vars for managed_node1 11579 1726882191.54761: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.54777: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.54781: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.54785: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.55957: done sending task result for task 12673a56-9f93-f197-7423-0000000003ff 11579 1726882191.55963: WORKER PROCESS EXITING 11579 1726882191.56927: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.58751: done with get_vars() 11579 1726882191.58775: done getting variables 11579 1726882191.58846: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.58967: variable 'profile' from source: include params 11579 1726882191.58971: variable 'item' from source: include params 11579 1726882191.59032: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.0] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:51 -0400 (0:00:00.072) 0:00:20.299 ****** 11579 1726882191.59068: entering _queue_task() for managed_node1/command 11579 1726882191.59535: worker is 1 (out of 1 available) 11579 1726882191.59550: exiting _queue_task() for managed_node1/command 11579 
1726882191.59560: done queuing things up, now waiting for results queue to drain 11579 1726882191.59562: waiting for pending results... 11579 1726882191.59849: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 11579 1726882191.59984: in run() - task 12673a56-9f93-f197-7423-000000000400 11579 1726882191.60011: variable 'ansible_search_path' from source: unknown 11579 1726882191.60029: variable 'ansible_search_path' from source: unknown 11579 1726882191.60072: calling self._execute() 11579 1726882191.60185: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.60201: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.60219: variable 'omit' from source: magic vars 11579 1726882191.60622: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.60642: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.60782: variable 'profile_stat' from source: set_fact 11579 1726882191.60800: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882191.60891: when evaluation is False, skipping this task 11579 1726882191.60899: _execute() done 11579 1726882191.60902: dumping result to json 11579 1726882191.60905: done dumping result, returning 11579 1726882191.60908: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-f197-7423-000000000400] 11579 1726882191.60910: sending task result for task 12673a56-9f93-f197-7423-000000000400 11579 1726882191.60978: done sending task result for task 12673a56-9f93-f197-7423-000000000400 11579 1726882191.60981: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882191.61046: no more pending results, returning what we have 11579 1726882191.61050: results queue empty 11579 
1726882191.61051: checking for any_errors_fatal 11579 1726882191.61059: done checking for any_errors_fatal 11579 1726882191.61060: checking for max_fail_percentage 11579 1726882191.61061: done checking for max_fail_percentage 11579 1726882191.61062: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.61064: done checking to see if all hosts have failed 11579 1726882191.61064: getting the remaining hosts for this loop 11579 1726882191.61066: done getting the remaining hosts for this loop 11579 1726882191.61070: getting the next task for host managed_node1 11579 1726882191.61080: done getting next task for host managed_node1 11579 1726882191.61083: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11579 1726882191.61088: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882191.61097: getting variables 11579 1726882191.61099: in VariableManager get_vars() 11579 1726882191.61148: Calling all_inventory to load vars for managed_node1 11579 1726882191.61151: Calling groups_inventory to load vars for managed_node1 11579 1726882191.61153: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.61171: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.61175: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.61178: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.62763: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.64182: done with get_vars() 11579 1726882191.64201: done getting variables 11579 1726882191.64243: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.64333: variable 'profile' from source: include params 11579 1726882191.64338: variable 'item' from source: include params 11579 1726882191.64398: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.0] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:51 -0400 (0:00:00.053) 0:00:20.352 ****** 11579 1726882191.64432: entering _queue_task() for managed_node1/set_fact 11579 1726882191.64922: worker is 1 (out of 1 available) 11579 1726882191.64930: exiting _queue_task() for managed_node1/set_fact 11579 1726882191.64942: done queuing things up, now waiting for results queue to drain 11579 1726882191.64944: waiting for pending results... 
11579 1726882191.65075: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 11579 1726882191.65147: in run() - task 12673a56-9f93-f197-7423-000000000401 11579 1726882191.65161: variable 'ansible_search_path' from source: unknown 11579 1726882191.65166: variable 'ansible_search_path' from source: unknown 11579 1726882191.65212: calling self._execute() 11579 1726882191.65303: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.65306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.65317: variable 'omit' from source: magic vars 11579 1726882191.65691: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.65703: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.65805: variable 'profile_stat' from source: set_fact 11579 1726882191.65814: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882191.65818: when evaluation is False, skipping this task 11579 1726882191.65820: _execute() done 11579 1726882191.65823: dumping result to json 11579 1726882191.65826: done dumping result, returning 11579 1726882191.65911: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.0 [12673a56-9f93-f197-7423-000000000401] 11579 1726882191.65914: sending task result for task 12673a56-9f93-f197-7423-000000000401 11579 1726882191.65974: done sending task result for task 12673a56-9f93-f197-7423-000000000401 11579 1726882191.65977: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882191.66066: no more pending results, returning what we have 11579 1726882191.66068: results queue empty 11579 1726882191.66069: checking for any_errors_fatal 11579 1726882191.66072: done checking for any_errors_fatal 11579 1726882191.66073: checking 
for max_fail_percentage 11579 1726882191.66075: done checking for max_fail_percentage 11579 1726882191.66076: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.66076: done checking to see if all hosts have failed 11579 1726882191.66077: getting the remaining hosts for this loop 11579 1726882191.66078: done getting the remaining hosts for this loop 11579 1726882191.66081: getting the next task for host managed_node1 11579 1726882191.66087: done getting next task for host managed_node1 11579 1726882191.66089: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11579 1726882191.66092: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882191.66175: getting variables 11579 1726882191.66177: in VariableManager get_vars() 11579 1726882191.66214: Calling all_inventory to load vars for managed_node1 11579 1726882191.66217: Calling groups_inventory to load vars for managed_node1 11579 1726882191.66219: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.66229: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.66231: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.66234: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.67707: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.69850: done with get_vars() 11579 1726882191.69872: done getting variables 11579 1726882191.69935: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.70348: variable 'profile' from source: include params 11579 1726882191.70352: variable 'item' from source: include params 11579 1726882191.70418: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.0'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:51 -0400 (0:00:00.060) 0:00:20.412 ****** 11579 1726882191.70450: entering _queue_task() for managed_node1/assert 11579 1726882191.70880: worker is 1 (out of 1 available) 11579 1726882191.70898: exiting _queue_task() for managed_node1/assert 11579 1726882191.70911: done queuing things up, now waiting for results queue to drain 11579 1726882191.70912: waiting for pending results... 
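For context while reading the trace: the task being queued here lives at `assert_profile_present.yml:5`, and the log below shows the `assert` action plugin loading and the conditional `lsr_net_profile_exists` evaluating to True. The playbook file itself is not included in this log, but the task presumably looks roughly like this hypothetical sketch:

```yaml
# Hypothetical reconstruction -- the real assert_profile_present.yml is not
# shown in this transcript. The task name and the conditional evaluated in
# the trace ('lsr_net_profile_exists') suggest a simple assert like:
- name: "Assert that the profile is present - '{{ profile }}'"
  assert:
    that:
      - lsr_net_profile_exists
```

The `{{ profile }}` in the task name is templated from the include params visible in the trace (`variable 'profile' from source: include params`), which is why the rendered task banner reads `'bond0.0'`.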
11579 1726882191.71314: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' 11579 1726882191.71325: in run() - task 12673a56-9f93-f197-7423-000000000267 11579 1726882191.71329: variable 'ansible_search_path' from source: unknown 11579 1726882191.71331: variable 'ansible_search_path' from source: unknown 11579 1726882191.71340: calling self._execute() 11579 1726882191.71449: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.71461: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.71475: variable 'omit' from source: magic vars 11579 1726882191.71842: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.71868: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.71880: variable 'omit' from source: magic vars 11579 1726882191.71923: variable 'omit' from source: magic vars 11579 1726882191.72032: variable 'profile' from source: include params 11579 1726882191.72087: variable 'item' from source: include params 11579 1726882191.72118: variable 'item' from source: include params 11579 1726882191.72142: variable 'omit' from source: magic vars 11579 1726882191.72195: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.72238: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.72263: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.72287: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.72399: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.72404: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882191.72406: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.72409: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.72473: Set connection var ansible_timeout to 10 11579 1726882191.72487: Set connection var ansible_shell_type to sh 11579 1726882191.72503: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.72514: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.72533: Set connection var ansible_pipelining to False 11579 1726882191.72540: Set connection var ansible_connection to ssh 11579 1726882191.72563: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.72570: variable 'ansible_connection' from source: unknown 11579 1726882191.72576: variable 'ansible_module_compression' from source: unknown 11579 1726882191.72582: variable 'ansible_shell_type' from source: unknown 11579 1726882191.72588: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.72638: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.72641: variable 'ansible_pipelining' from source: unknown 11579 1726882191.72643: variable 'ansible_timeout' from source: unknown 11579 1726882191.72645: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.72762: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.72777: variable 'omit' from source: magic vars 11579 1726882191.72786: starting attempt loop 11579 1726882191.72791: running the handler 11579 1726882191.72909: variable 'lsr_net_profile_exists' from source: set_fact 11579 1726882191.72963: Evaluated conditional 
(lsr_net_profile_exists): True 11579 1726882191.72966: handler run complete 11579 1726882191.72968: attempt loop complete, returning result 11579 1726882191.72970: _execute() done 11579 1726882191.72972: dumping result to json 11579 1726882191.72974: done dumping result, returning 11579 1726882191.72975: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.0' [12673a56-9f93-f197-7423-000000000267] 11579 1726882191.72979: sending task result for task 12673a56-9f93-f197-7423-000000000267 ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882191.73120: no more pending results, returning what we have 11579 1726882191.73123: results queue empty 11579 1726882191.73124: checking for any_errors_fatal 11579 1726882191.73130: done checking for any_errors_fatal 11579 1726882191.73131: checking for max_fail_percentage 11579 1726882191.73132: done checking for max_fail_percentage 11579 1726882191.73133: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.73134: done checking to see if all hosts have failed 11579 1726882191.73135: getting the remaining hosts for this loop 11579 1726882191.73136: done getting the remaining hosts for this loop 11579 1726882191.73139: getting the next task for host managed_node1 11579 1726882191.73145: done getting next task for host managed_node1 11579 1726882191.73147: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11579 1726882191.73149: ^ state is: HOST STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.73153: getting variables 11579 1726882191.73154: in VariableManager get_vars() 11579 1726882191.73199: Calling all_inventory to load vars for managed_node1 11579 1726882191.73202: Calling groups_inventory to load vars for managed_node1 11579 1726882191.73205: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.73217: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.73221: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.73224: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.73804: done sending task result for task 12673a56-9f93-f197-7423-000000000267 11579 1726882191.73807: WORKER PROCESS EXITING 11579 1726882191.74757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.76446: done with get_vars() 11579 1726882191.76474: done getting variables 11579 1726882191.76549: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.76679: variable 'profile' from source: include params 11579 1726882191.76683: variable 'item' from source: include params 11579 1726882191.76755: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.0'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:51 -0400 (0:00:00.063) 0:00:20.476 ****** 11579 1726882191.76797: entering _queue_task() for managed_node1/assert 11579 
1726882191.77161: worker is 1 (out of 1 available) 11579 1726882191.77173: exiting _queue_task() for managed_node1/assert 11579 1726882191.77405: done queuing things up, now waiting for results queue to drain 11579 1726882191.77407: waiting for pending results... 11579 1726882191.77458: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' 11579 1726882191.77571: in run() - task 12673a56-9f93-f197-7423-000000000268 11579 1726882191.77591: variable 'ansible_search_path' from source: unknown 11579 1726882191.77602: variable 'ansible_search_path' from source: unknown 11579 1726882191.77653: calling self._execute() 11579 1726882191.77769: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.77780: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.77797: variable 'omit' from source: magic vars 11579 1726882191.78206: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.78226: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.78238: variable 'omit' from source: magic vars 11579 1726882191.78289: variable 'omit' from source: magic vars 11579 1726882191.78397: variable 'profile' from source: include params 11579 1726882191.78408: variable 'item' from source: include params 11579 1726882191.78472: variable 'item' from source: include params 11579 1726882191.78506: variable 'omit' from source: magic vars 11579 1726882191.78551: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.78600: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.78627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.78648: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py 
(found_in_cache=True, class_only=False) 11579 1726882191.78662: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.78697: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882191.78821: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.78825: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.78830: Set connection var ansible_timeout to 10 11579 1726882191.78842: Set connection var ansible_shell_type to sh 11579 1726882191.78855: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.78864: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.78876: Set connection var ansible_pipelining to False 11579 1726882191.78883: Set connection var ansible_connection to ssh 11579 1726882191.78914: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.78928: variable 'ansible_connection' from source: unknown 11579 1726882191.78939: variable 'ansible_module_compression' from source: unknown 11579 1726882191.79001: variable 'ansible_shell_type' from source: unknown 11579 1726882191.79004: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.79007: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.79009: variable 'ansible_pipelining' from source: unknown 11579 1726882191.79011: variable 'ansible_timeout' from source: unknown 11579 1726882191.79014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.79116: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 
1726882191.79134: variable 'omit' from source: magic vars 11579 1726882191.79162: starting attempt loop 11579 1726882191.79172: running the handler 11579 1726882191.79308: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11579 1726882191.79325: Evaluated conditional (lsr_net_profile_ansible_managed): True 11579 1726882191.79364: handler run complete 11579 1726882191.79372: attempt loop complete, returning result 11579 1726882191.79374: _execute() done 11579 1726882191.79377: dumping result to json 11579 1726882191.79383: done dumping result, returning 11579 1726882191.79401: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.0' [12673a56-9f93-f197-7423-000000000268] 11579 1726882191.79473: sending task result for task 12673a56-9f93-f197-7423-000000000268 11579 1726882191.79544: done sending task result for task 12673a56-9f93-f197-7423-000000000268 11579 1726882191.79547: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882191.79638: no more pending results, returning what we have 11579 1726882191.79642: results queue empty 11579 1726882191.79643: checking for any_errors_fatal 11579 1726882191.79649: done checking for any_errors_fatal 11579 1726882191.79650: checking for max_fail_percentage 11579 1726882191.79652: done checking for max_fail_percentage 11579 1726882191.79653: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.79654: done checking to see if all hosts have failed 11579 1726882191.79654: getting the remaining hosts for this loop 11579 1726882191.79656: done getting the remaining hosts for this loop 11579 1726882191.79659: getting the next task for host managed_node1 11579 1726882191.79666: done getting next task for host managed_node1 11579 1726882191.79669: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11579 1726882191.79672: ^ state is: HOST 
STATE: block=2, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.79677: getting variables 11579 1726882191.79679: in VariableManager get_vars() 11579 1726882191.79734: Calling all_inventory to load vars for managed_node1 11579 1726882191.79737: Calling groups_inventory to load vars for managed_node1 11579 1726882191.79740: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.79752: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.79755: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.79757: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.81064: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.81932: done with get_vars() 11579 1726882191.81953: done getting variables 11579 1726882191.82021: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882191.82133: variable 'profile' from source: include params 11579 1726882191.82136: variable 'item' from source: include params 11579 1726882191.82187: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.0] *************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:51 -0400 (0:00:00.054) 0:00:20.530 ****** 11579 1726882191.82224: entering _queue_task() for managed_node1/assert 11579 1726882191.82536: worker is 1 (out of 1 available) 11579 1726882191.82548: exiting _queue_task() for managed_node1/assert 11579 1726882191.82558: done queuing things up, now waiting for results queue to drain 11579 1726882191.82563: waiting for pending results... 11579 1726882191.83014: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 11579 1726882191.83020: in run() - task 12673a56-9f93-f197-7423-000000000269 11579 1726882191.83023: variable 'ansible_search_path' from source: unknown 11579 1726882191.83026: variable 'ansible_search_path' from source: unknown 11579 1726882191.83029: calling self._execute() 11579 1726882191.83200: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.83204: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.83206: variable 'omit' from source: magic vars 11579 1726882191.83464: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.83477: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.83482: variable 'omit' from source: magic vars 11579 1726882191.83522: variable 'omit' from source: magic vars 11579 1726882191.83649: variable 'profile' from source: include params 11579 1726882191.83653: variable 'item' from source: include params 11579 1726882191.83710: variable 'item' from source: include params 11579 1726882191.83728: variable 'omit' from source: magic vars 11579 1726882191.83791: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.83838: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.83856: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.83875: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.83884: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.83913: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882191.83917: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.83919: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.84002: Set connection var ansible_timeout to 10 11579 1726882191.84009: Set connection var ansible_shell_type to sh 11579 1726882191.84015: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.84021: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.84027: Set connection var ansible_pipelining to False 11579 1726882191.84030: Set connection var ansible_connection to ssh 11579 1726882191.84045: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.84047: variable 'ansible_connection' from source: unknown 11579 1726882191.84050: variable 'ansible_module_compression' from source: unknown 11579 1726882191.84052: variable 'ansible_shell_type' from source: unknown 11579 1726882191.84054: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.84057: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.84061: variable 'ansible_pipelining' from source: unknown 11579 1726882191.84066: variable 'ansible_timeout' from source: unknown 11579 1726882191.84068: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 
1726882191.84163: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.84172: variable 'omit' from source: magic vars 11579 1726882191.84178: starting attempt loop 11579 1726882191.84181: running the handler 11579 1726882191.84256: variable 'lsr_net_profile_fingerprint' from source: set_fact 11579 1726882191.84259: Evaluated conditional (lsr_net_profile_fingerprint): True 11579 1726882191.84265: handler run complete 11579 1726882191.84276: attempt loop complete, returning result 11579 1726882191.84278: _execute() done 11579 1726882191.84281: dumping result to json 11579 1726882191.84284: done dumping result, returning 11579 1726882191.84299: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.0 [12673a56-9f93-f197-7423-000000000269] 11579 1726882191.84301: sending task result for task 12673a56-9f93-f197-7423-000000000269 11579 1726882191.84376: done sending task result for task 12673a56-9f93-f197-7423-000000000269 11579 1726882191.84379: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882191.84452: no more pending results, returning what we have 11579 1726882191.84455: results queue empty 11579 1726882191.84456: checking for any_errors_fatal 11579 1726882191.84462: done checking for any_errors_fatal 11579 1726882191.84463: checking for max_fail_percentage 11579 1726882191.84464: done checking for max_fail_percentage 11579 1726882191.84465: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.84466: done checking to see if all hosts have failed 11579 1726882191.84467: getting the remaining hosts for this loop 11579 1726882191.84468: done getting the 
remaining hosts for this loop 11579 1726882191.84471: getting the next task for host managed_node1 11579 1726882191.84478: done getting next task for host managed_node1 11579 1726882191.84481: ^ task is: TASK: Include the task 'get_profile_stat.yml' 11579 1726882191.84483: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.84487: getting variables 11579 1726882191.84488: in VariableManager get_vars() 11579 1726882191.84524: Calling all_inventory to load vars for managed_node1 11579 1726882191.84527: Calling groups_inventory to load vars for managed_node1 11579 1726882191.84529: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.84538: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.84540: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.84543: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.85291: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.86605: done with get_vars() 11579 1726882191.86621: done getting variables TASK [Include the task 'get_profile_stat.yml'] ********************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:3 Friday 20 September 2024 21:29:51 -0400 (0:00:00.044) 0:00:20.575 ****** 11579 1726882191.86690: entering 
_queue_task() for managed_node1/include_tasks 11579 1726882191.86917: worker is 1 (out of 1 available) 11579 1726882191.86929: exiting _queue_task() for managed_node1/include_tasks 11579 1726882191.86939: done queuing things up, now waiting for results queue to drain 11579 1726882191.86941: waiting for pending results... 11579 1726882191.87111: running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' 11579 1726882191.87188: in run() - task 12673a56-9f93-f197-7423-00000000026d 11579 1726882191.87205: variable 'ansible_search_path' from source: unknown 11579 1726882191.87208: variable 'ansible_search_path' from source: unknown 11579 1726882191.87236: calling self._execute() 11579 1726882191.87311: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.87315: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.87326: variable 'omit' from source: magic vars 11579 1726882191.87578: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.87588: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.87595: _execute() done 11579 1726882191.87607: dumping result to json 11579 1726882191.87611: done dumping result, returning 11579 1726882191.87613: done running TaskExecutor() for managed_node1/TASK: Include the task 'get_profile_stat.yml' [12673a56-9f93-f197-7423-00000000026d] 11579 1726882191.87616: sending task result for task 12673a56-9f93-f197-7423-00000000026d 11579 1726882191.87696: done sending task result for task 12673a56-9f93-f197-7423-00000000026d 11579 1726882191.87699: WORKER PROCESS EXITING 11579 1726882191.87728: no more pending results, returning what we have 11579 1726882191.87733: in VariableManager get_vars() 11579 1726882191.87777: Calling all_inventory to load vars for managed_node1 11579 1726882191.87779: Calling groups_inventory to load vars for managed_node1 11579 1726882191.87781: Calling 
all_plugins_inventory to load vars for managed_node1 11579 1726882191.87795: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.87798: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.87801: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.88661: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.89500: done with get_vars() 11579 1726882191.89514: variable 'ansible_search_path' from source: unknown 11579 1726882191.89515: variable 'ansible_search_path' from source: unknown 11579 1726882191.89539: we have included files to process 11579 1726882191.89540: generating all_blocks data 11579 1726882191.89541: done generating all_blocks data 11579 1726882191.89544: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882191.89545: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882191.89547: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml 11579 1726882191.90126: done processing included file 11579 1726882191.90128: iterating over new_blocks loaded from include file 11579 1726882191.90129: in VariableManager get_vars() 11579 1726882191.90141: done with get_vars() 11579 1726882191.90143: filtering new block on tags 11579 1726882191.90157: done filtering new block on tags 11579 1726882191.90159: in VariableManager get_vars() 11579 1726882191.90171: done with get_vars() 11579 1726882191.90173: filtering new block on tags 11579 1726882191.90186: done filtering new block on tags 11579 1726882191.90187: done iterating over new_blocks loaded from include file included: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml for managed_node1 11579 1726882191.90190: extending task lists for all hosts with included blocks 11579 1726882191.90289: done extending task lists 11579 1726882191.90290: done processing included files 11579 1726882191.90291: results queue empty 11579 1726882191.90291: checking for any_errors_fatal 11579 1726882191.90294: done checking for any_errors_fatal 11579 1726882191.90295: checking for max_fail_percentage 11579 1726882191.90296: done checking for max_fail_percentage 11579 1726882191.90296: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.90297: done checking to see if all hosts have failed 11579 1726882191.90297: getting the remaining hosts for this loop 11579 1726882191.90298: done getting the remaining hosts for this loop 11579 1726882191.90300: getting the next task for host managed_node1 11579 1726882191.90303: done getting next task for host managed_node1 11579 1726882191.90304: ^ task is: TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882191.90306: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False 11579 1726882191.90307: getting variables 11579 1726882191.90308: in VariableManager get_vars() 11579 1726882191.90317: Calling all_inventory to load vars for managed_node1 11579 1726882191.90318: Calling groups_inventory to load vars for managed_node1 11579 1726882191.90319: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.90323: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.90324: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.90326: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.90958: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.91842: done with get_vars() 11579 1726882191.91856: done getting variables 11579 1726882191.91881: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Initialize NM profile exist and ansible_managed comment flag] ************ task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:3 Friday 20 September 2024 21:29:51 -0400 (0:00:00.052) 0:00:20.627 ****** 11579 1726882191.91905: entering _queue_task() for managed_node1/set_fact 11579 1726882191.92141: worker is 1 (out of 1 available) 11579 1726882191.92153: exiting _queue_task() for managed_node1/set_fact 11579 1726882191.92164: done queuing things up, now waiting for results queue to drain 11579 1726882191.92165: waiting for pending results... 
11579 1726882191.92712: running TaskExecutor() for managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag 11579 1726882191.92719: in run() - task 12673a56-9f93-f197-7423-000000000440 11579 1726882191.92739: variable 'ansible_search_path' from source: unknown 11579 1726882191.92749: variable 'ansible_search_path' from source: unknown 11579 1726882191.92791: calling self._execute() 11579 1726882191.92912: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.92926: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.92954: variable 'omit' from source: magic vars 11579 1726882191.93374: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.93495: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.93499: variable 'omit' from source: magic vars 11579 1726882191.93502: variable 'omit' from source: magic vars 11579 1726882191.93515: variable 'omit' from source: magic vars 11579 1726882191.93563: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.93641: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.93662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.93678: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.93728: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.93731: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882191.93734: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.93736: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11579 1726882191.93984: Set connection var ansible_timeout to 10 11579 1726882191.94036: Set connection var ansible_shell_type to sh 11579 1726882191.94039: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.94042: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.94045: Set connection var ansible_pipelining to False 11579 1726882191.94055: Set connection var ansible_connection to ssh 11579 1726882191.94058: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.94060: variable 'ansible_connection' from source: unknown 11579 1726882191.94063: variable 'ansible_module_compression' from source: unknown 11579 1726882191.94065: variable 'ansible_shell_type' from source: unknown 11579 1726882191.94067: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.94069: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.94071: variable 'ansible_pipelining' from source: unknown 11579 1726882191.94073: variable 'ansible_timeout' from source: unknown 11579 1726882191.94075: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.94255: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882191.94260: variable 'omit' from source: magic vars 11579 1726882191.94267: starting attempt loop 11579 1726882191.94273: running the handler 11579 1726882191.94276: handler run complete 11579 1726882191.94278: attempt loop complete, returning result 11579 1726882191.94280: _execute() done 11579 1726882191.94282: dumping result to json 11579 1726882191.94284: done dumping result, returning 11579 1726882191.94286: done running TaskExecutor() for 
managed_node1/TASK: Initialize NM profile exist and ansible_managed comment flag [12673a56-9f93-f197-7423-000000000440] 11579 1726882191.94288: sending task result for task 12673a56-9f93-f197-7423-000000000440 11579 1726882191.94366: done sending task result for task 12673a56-9f93-f197-7423-000000000440 11579 1726882191.94369: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": false, "lsr_net_profile_exists": false, "lsr_net_profile_fingerprint": false }, "changed": false } 11579 1726882191.94429: no more pending results, returning what we have 11579 1726882191.94432: results queue empty 11579 1726882191.94433: checking for any_errors_fatal 11579 1726882191.94434: done checking for any_errors_fatal 11579 1726882191.94435: checking for max_fail_percentage 11579 1726882191.94436: done checking for max_fail_percentage 11579 1726882191.94437: checking to see if all hosts have failed and the running result is not ok 11579 1726882191.94438: done checking to see if all hosts have failed 11579 1726882191.94439: getting the remaining hosts for this loop 11579 1726882191.94441: done getting the remaining hosts for this loop 11579 1726882191.94444: getting the next task for host managed_node1 11579 1726882191.94450: done getting next task for host managed_node1 11579 1726882191.94452: ^ task is: TASK: Stat profile file 11579 1726882191.94456: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? 
(None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882191.94460: getting variables 11579 1726882191.94462: in VariableManager get_vars() 11579 1726882191.94506: Calling all_inventory to load vars for managed_node1 11579 1726882191.94509: Calling groups_inventory to load vars for managed_node1 11579 1726882191.94511: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882191.94522: Calling all_plugins_play to load vars for managed_node1 11579 1726882191.94524: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882191.94527: Calling groups_plugins_play to load vars for managed_node1 11579 1726882191.95862: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882191.97350: done with get_vars() 11579 1726882191.97369: done getting variables TASK [Stat profile file] ******************************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:9 Friday 20 September 2024 21:29:51 -0400 (0:00:00.055) 0:00:20.682 ****** 11579 1726882191.97455: entering _queue_task() for managed_node1/stat 11579 1726882191.97834: worker is 1 (out of 1 available) 11579 1726882191.97846: exiting _queue_task() for managed_node1/stat 11579 1726882191.97857: done queuing things up, now waiting for results queue to drain 11579 1726882191.97858: waiting for pending results... 
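The set_fact result earlier in this trace shows three flags initialized to false. A minimal sketch of what the task at get_profile_stat.yml:3 likely contains, reconstructed only from the task name and the `ansible_facts` printed in this log (the actual file contents are not in the log and may differ):

```yaml
# Hypothetical reconstruction from the log output; not the verbatim file.
- name: Initialize NM profile exist and ansible_managed comment flag
  set_fact:
    lsr_net_profile_exists: false
    lsr_net_profile_ansible_managed: false
    lsr_net_profile_fingerprint: false
```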
11579 1726882191.98251: running TaskExecutor() for managed_node1/TASK: Stat profile file 11579 1726882191.98256: in run() - task 12673a56-9f93-f197-7423-000000000441 11579 1726882191.98318: variable 'ansible_search_path' from source: unknown 11579 1726882191.98322: variable 'ansible_search_path' from source: unknown 11579 1726882191.98326: calling self._execute() 11579 1726882191.98440: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.98457: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.98467: variable 'omit' from source: magic vars 11579 1726882191.98772: variable 'ansible_distribution_major_version' from source: facts 11579 1726882191.98783: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882191.98788: variable 'omit' from source: magic vars 11579 1726882191.98829: variable 'omit' from source: magic vars 11579 1726882191.98901: variable 'profile' from source: include params 11579 1726882191.98905: variable 'item' from source: include params 11579 1726882191.98950: variable 'item' from source: include params 11579 1726882191.98965: variable 'omit' from source: magic vars 11579 1726882191.99001: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882191.99029: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882191.99045: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882191.99058: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.99067: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882191.99092: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 
1726882191.99097: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.99105: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.99171: Set connection var ansible_timeout to 10 11579 1726882191.99175: Set connection var ansible_shell_type to sh 11579 1726882191.99184: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882191.99187: Set connection var ansible_shell_executable to /bin/sh 11579 1726882191.99195: Set connection var ansible_pipelining to False 11579 1726882191.99201: Set connection var ansible_connection to ssh 11579 1726882191.99219: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.99223: variable 'ansible_connection' from source: unknown 11579 1726882191.99225: variable 'ansible_module_compression' from source: unknown 11579 1726882191.99227: variable 'ansible_shell_type' from source: unknown 11579 1726882191.99229: variable 'ansible_shell_executable' from source: unknown 11579 1726882191.99232: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882191.99234: variable 'ansible_pipelining' from source: unknown 11579 1726882191.99237: variable 'ansible_timeout' from source: unknown 11579 1726882191.99241: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882191.99387: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882191.99397: variable 'omit' from source: magic vars 11579 1726882191.99410: starting attempt loop 11579 1726882191.99413: running the handler 11579 1726882191.99420: _low_level_execute_command(): starting 11579 1726882191.99430: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882191.99929: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882191.99934: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882191.99938: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882191.99988: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882191.99991: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882191.99999: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.00051: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.01719: stdout chunk (state=3): >>>/root <<< 11579 1726882192.01875: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.01878: stdout chunk (state=3): >>><<< 11579 1726882192.01881: stderr chunk (state=3): >>><<< 11579 1726882192.01909: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.02009: _low_level_execute_command(): starting 11579 1726882192.02015: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024 `" && echo ansible-tmp-1726882192.0191877-12553-194479015798024="` echo /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024 `" ) && sleep 0' 11579 1726882192.02785: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 
is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.03002: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.03033: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.04903: stdout chunk (state=3): >>>ansible-tmp-1726882192.0191877-12553-194479015798024=/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024 <<< 11579 1726882192.05066: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.05069: stdout chunk (state=3): >>><<< 11579 1726882192.05071: stderr chunk (state=3): >>><<< 11579 1726882192.05300: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882192.0191877-12553-194479015798024=/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.05303: variable 'ansible_module_compression' from source: unknown 11579 1726882192.05306: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.stat-ZIP_DEFLATED 11579 1726882192.05308: variable 'ansible_facts' from source: unknown 11579 1726882192.05351: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py 11579 1726882192.05552: Sending initial data 11579 1726882192.05563: Sent initial data (153 bytes) 11579 1726882192.06677: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882192.06784: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.06820: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.06890: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.08410: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882192.08449: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882192.08512: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp2fpv___g /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py <<< 11579 1726882192.08516: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py" <<< 11579 1726882192.08569: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp2fpv___g" to remote "/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py" <<< 11579 1726882192.09864: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.09874: stdout chunk (state=3): >>><<< 11579 1726882192.09885: stderr chunk (state=3): >>><<< 11579 1726882192.09913: done transferring module to remote 11579 1726882192.10076: _low_level_execute_command(): starting 11579 1726882192.10079: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/ /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py && sleep 0' 11579 1726882192.11092: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882192.11308: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.11324: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882192.11342: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.11401: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.11413: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.13118: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.13223: stderr chunk (state=3): >>><<< 11579 1726882192.13226: stdout chunk (state=3): >>><<< 11579 1726882192.13400: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.13407: _low_level_execute_command(): starting 11579 1726882192.13409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/AnsiballZ_stat.py && sleep 0' 11579 1726882192.14569: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.14813: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.14897: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.30201: stdout chunk (state=3): >>> {"changed": false, "stat": {"exists": false}, "invocation": 
{"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} <<< 11579 1726882192.31540: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882192.31544: stdout chunk (state=3): >>><<< 11579 1726882192.31547: stderr chunk (state=3): >>><<< 11579 1726882192.31550: _low_level_execute_command() done: rc=0, stdout= {"changed": false, "stat": {"exists": false}, "invocation": {"module_args": {"get_attributes": false, "get_checksum": false, "get_mime": false, "path": "/etc/sysconfig/network-scripts/ifcfg-bond0.1", "follow": false, "checksum_algorithm": "sha1"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882192.31552: done with _execute_module (stat, {'get_attributes': False, 'get_checksum': False, 'get_mime': False, 'path': '/etc/sysconfig/network-scripts/ifcfg-bond0.1', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'stat', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882192.31737: _low_level_execute_command(): starting 11579 1726882192.31741: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882192.0191877-12553-194479015798024/ > /dev/null 2>&1 && sleep 0' 11579 1726882192.32626: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882192.32630: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882192.32644: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882192.32647: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading 
configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.32691: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882192.32708: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.32742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.32810: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.34688: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.34708: stdout chunk (state=3): >>><<< 11579 1726882192.34720: stderr chunk (state=3): >>><<< 11579 1726882192.34740: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0
11579 1726882192.34752: handler run complete
11579 1726882192.34776: attempt loop complete, returning result
11579 1726882192.34783: _execute() done
11579 1726882192.34789: dumping result to json
11579 1726882192.34805: done dumping result, returning
11579 1726882192.34818: done running TaskExecutor() for managed_node1/TASK: Stat profile file [12673a56-9f93-f197-7423-000000000441]
11579 1726882192.34833: sending task result for task 12673a56-9f93-f197-7423-000000000441
ok: [managed_node1] => {
    "changed": false,
    "stat": {
        "exists": false
    }
}
11579 1726882192.35062: no more pending results, returning what we have
11579 1726882192.35066: results queue empty
11579 1726882192.35067: checking for any_errors_fatal
11579 1726882192.35075: done checking for any_errors_fatal
11579 1726882192.35077: checking for max_fail_percentage
11579 1726882192.35079: done checking for max_fail_percentage
11579 1726882192.35080: checking to see if all hosts have failed and the running result is not ok
11579 1726882192.35081: done checking to see if all hosts have failed
11579 1726882192.35082: getting the remaining hosts for this loop
11579 1726882192.35084: done getting the remaining hosts for this loop
11579 1726882192.35088: getting the next task for host managed_node1
11579 1726882192.35100: done getting next task for host managed_node1
11579 1726882192.35103: ^ task is: TASK: Set NM profile exist flag based on the profile files
11579 1726882192.35107: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882192.35112: getting variables
11579 1726882192.35114: in VariableManager get_vars()
11579 1726882192.35163: Calling all_inventory to load vars for managed_node1
11579 1726882192.35166: Calling groups_inventory to load vars for managed_node1
11579 1726882192.35169: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882192.35183: Calling all_plugins_play to load vars for managed_node1
11579 1726882192.35187: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882192.35191: Calling groups_plugins_play to load vars for managed_node1
11579 1726882192.35412: done sending task result for task 12673a56-9f93-f197-7423-000000000441
11579 1726882192.35415: WORKER PROCESS EXITING
11579 1726882192.37150: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882192.38708: done with get_vars()
11579 1726882192.38734: done getting variables
11579 1726882192.38792: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Set NM profile exist flag based on the profile files] ********************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:17
Friday 20 September 2024 21:29:52 -0400 (0:00:00.413) 0:00:21.096 ******
11579 1726882192.38829: entering _queue_task() for managed_node1/set_fact
11579 1726882192.39159: worker is 1 (out of 1 available)
11579 1726882192.39172: exiting _queue_task() for managed_node1/set_fact
11579 1726882192.39183: done queuing things up, now waiting for results queue to drain
11579 1726882192.39185: waiting for pending results...
11579 1726882192.39455: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files
11579 1726882192.39572: in run() - task 12673a56-9f93-f197-7423-000000000442
11579 1726882192.39592: variable 'ansible_search_path' from source: unknown
11579 1726882192.39604: variable 'ansible_search_path' from source: unknown
11579 1726882192.39649: calling self._execute()
11579 1726882192.39746: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882192.39756: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882192.39769: variable 'omit' from source: magic vars
11579 1726882192.40130: variable 'ansible_distribution_major_version' from source: facts
11579 1726882192.40146: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882192.40258: variable 'profile_stat' from source: set_fact
11579 1726882192.40279: Evaluated conditional (profile_stat.stat.exists): False
11579 1726882192.40285: when evaluation is False, skipping this task
11579 1726882192.40291: _execute() done
11579 1726882192.40302: dumping result to json
11579 1726882192.40308: done dumping result, returning
11579 1726882192.40318: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag based on the profile files [12673a56-9f93-f197-7423-000000000442]
11579 1726882192.40326: sending task result for task 12673a56-9f93-f197-7423-000000000442
skipping: [managed_node1] => {
    "changed": false,
    "false_condition": "profile_stat.stat.exists",
    "skip_reason": "Conditional result was False"
}
11579 1726882192.40463: no more pending results, returning what we have
11579 1726882192.40467: results queue empty
11579 1726882192.40468: checking for any_errors_fatal
11579 1726882192.40480: done checking for any_errors_fatal
11579 1726882192.40481: checking for max_fail_percentage
11579 1726882192.40483: done checking for max_fail_percentage
11579 1726882192.40484: checking to see if all hosts have failed and the running result is not ok
11579 1726882192.40485: done checking to see if all hosts have failed
11579 1726882192.40486: getting the remaining hosts for this loop
11579 1726882192.40488: done getting the remaining hosts for this loop
11579 1726882192.40491: getting the next task for host managed_node1
11579 1726882192.40504: done getting next task for host managed_node1
11579 1726882192.40506: ^ task is: TASK: Get NM profile info
11579 1726882192.40510: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882192.40515: getting variables
11579 1726882192.40517: in VariableManager get_vars()
11579 1726882192.40557: Calling all_inventory to load vars for managed_node1
11579 1726882192.40560: Calling groups_inventory to load vars for managed_node1
11579 1726882192.40562: Calling all_plugins_inventory to load vars for managed_node1
11579 1726882192.40574: Calling all_plugins_play to load vars for managed_node1
11579 1726882192.40577: Calling groups_plugins_inventory to load vars for managed_node1
11579 1726882192.40579: Calling groups_plugins_play to load vars for managed_node1
11579 1726882192.41208: done sending task result for task 12673a56-9f93-f197-7423-000000000442
11579 1726882192.41211: WORKER PROCESS EXITING
11579 1726882192.42114: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882192.43644: done with get_vars()
11579 1726882192.43667: done getting variables
11579 1726882192.43734: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True)

TASK [Get NM profile info] *****************************************************
task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:25
Friday 20 September 2024 21:29:52 -0400 (0:00:00.049) 0:00:21.146 ******
11579 1726882192.43766: entering _queue_task() for managed_node1/shell
11579 1726882192.44128: worker is 1 (out of 1 available)
11579 1726882192.44142: exiting _queue_task() for managed_node1/shell
11579 1726882192.44154: done queuing things up, now waiting for results queue to drain
11579 1726882192.44155: waiting for pending results... 
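To run the shell task just queued, `_low_level_execute_command()` first creates a per-task remote temp directory, as the `umask 77 && mkdir -p ... && mkdir ...` command further down shows. A sketch of that pattern, assuming POSIX sh; `make_remote_tmp` and the paths here are illustrative names, not Ansible's:

```shell
#!/bin/sh
# Sketch of the remote tmpdir creation pattern seen in this run:
# `umask 77` makes the new directory mode 0700, and the second,
# non-`-p` mkdir fails if the unique name already exists, so two
# runs can never end up sharing a directory.
make_remote_tmp() {
    base=$1 name=$2
    ( umask 77 && mkdir -p "$base" && mkdir "$base/$name" ) \
        && printf '%s\n' "$base/$name"
}

# Usage (cleaned up afterwards); the name mimics ansible-tmp-<epoch>-<pid>
base=$(mktemp -d)
make_remote_tmp "$base" "ansible-tmp-$(date +%s)-$$"
rm -rf "$base"
```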
11579 1726882192.44425: running TaskExecutor() for managed_node1/TASK: Get NM profile info
11579 1726882192.44526: in run() - task 12673a56-9f93-f197-7423-000000000443
11579 1726882192.44542: variable 'ansible_search_path' from source: unknown
11579 1726882192.44546: variable 'ansible_search_path' from source: unknown
11579 1726882192.44587: calling self._execute()
11579 1726882192.44677: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882192.44681: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882192.44701: variable 'omit' from source: magic vars
11579 1726882192.45066: variable 'ansible_distribution_major_version' from source: facts
11579 1726882192.45080: Evaluated conditional (ansible_distribution_major_version != '6'): True
11579 1726882192.45086: variable 'omit' from source: magic vars
11579 1726882192.45137: variable 'omit' from source: magic vars
11579 1726882192.45273: variable 'profile' from source: include params
11579 1726882192.45277: variable 'item' from source: include params
11579 1726882192.45301: variable 'item' from source: include params
11579 1726882192.45320: variable 'omit' from source: magic vars
11579 1726882192.45364: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection
11579 1726882192.45411: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False)
11579 1726882192.45428: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell
11579 1726882192.45431: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882192.45490: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False)
11579 1726882192.45497: variable 'inventory_hostname' from source: host vars for 'managed_node1'
11579 1726882192.45501: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882192.45503: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882192.45587: Set connection var ansible_timeout to 10
11579 1726882192.45603: Set connection var ansible_shell_type to sh
11579 1726882192.45606: Set connection var ansible_module_compression to ZIP_DEFLATED
11579 1726882192.45609: Set connection var ansible_shell_executable to /bin/sh
11579 1726882192.45713: Set connection var ansible_pipelining to False
11579 1726882192.45716: Set connection var ansible_connection to ssh
11579 1726882192.45719: variable 'ansible_shell_executable' from source: unknown
11579 1726882192.45721: variable 'ansible_connection' from source: unknown
11579 1726882192.45723: variable 'ansible_module_compression' from source: unknown
11579 1726882192.45725: variable 'ansible_shell_type' from source: unknown
11579 1726882192.45727: variable 'ansible_shell_executable' from source: unknown
11579 1726882192.45729: variable 'ansible_host' from source: host vars for 'managed_node1'
11579 1726882192.45731: variable 'ansible_pipelining' from source: unknown
11579 1726882192.45734: variable 'ansible_timeout' from source: unknown
11579 1726882192.45736: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1'
11579 1726882192.45801: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False)
11579 1726882192.45812: variable 'omit' from source: magic vars
11579 1726882192.45819: starting attempt loop
11579 1726882192.45822: running the handler
11579 1726882192.45830: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882192.45851: _low_level_execute_command(): starting 11579 1726882192.45858: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882192.46581: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882192.46596: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882192.46607: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882192.46623: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882192.46636: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882192.46643: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882192.46653: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.46667: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882192.46699: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882192.46703: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882192.46705: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882192.46797: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting 
O_NONBLOCK <<< 11579 1726882192.46835: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.46904: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.48481: stdout chunk (state=3): >>>/root <<< 11579 1726882192.48588: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.48620: stderr chunk (state=3): >>><<< 11579 1726882192.48626: stdout chunk (state=3): >>><<< 11579 1726882192.48685: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.48689: _low_level_execute_command(): starting 11579 1726882192.48692: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo 
/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335 `" && echo ansible-tmp-1726882192.486529-12589-219022410818335="` echo /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335 `" ) && sleep 0' 11579 1726882192.49302: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882192.49306: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882192.49308: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882192.49311: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882192.49313: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882192.49381: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882192.49406: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.49410: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.49532: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.51365: stdout chunk (state=3): 
>>>ansible-tmp-1726882192.486529-12589-219022410818335=/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335 <<< 11579 1726882192.51479: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.51508: stderr chunk (state=3): >>><<< 11579 1726882192.51511: stdout chunk (state=3): >>><<< 11579 1726882192.51527: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882192.486529-12589-219022410818335=/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.51550: variable 'ansible_module_compression' from source: unknown 11579 1726882192.51639: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882192.51653: variable 'ansible_facts' 
from source: unknown 11579 1726882192.51737: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py 11579 1726882192.51915: Sending initial data 11579 1726882192.51918: Sent initial data (155 bytes) 11579 1726882192.52423: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882192.52505: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882192.52509: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882192.52511: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882192.52513: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882192.52515: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882192.52517: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.52519: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882192.52520: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882192.52522: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882192.52578: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.52612: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 
1726882192.52664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.54159: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11579 1726882192.54167: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882192.54200: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882192.54239: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpqoyroqtr /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py <<< 11579 1726882192.54250: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py" <<< 11579 1726882192.54279: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpqoyroqtr" to remote "/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py" <<< 11579 1726882192.55045: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.55050: stdout chunk (state=3): >>><<< 11579 1726882192.55052: stderr chunk (state=3): >>><<< 11579 1726882192.55054: done transferring module to remote 11579 1726882192.55056: _low_level_execute_command(): starting 11579 1726882192.55058: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/ /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py && sleep 0' 11579 1726882192.55666: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: 
re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.55669: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882192.55672: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.55674: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.55715: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.57412: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.57434: stderr chunk (state=3): >>><<< 11579 1726882192.57437: stdout chunk (state=3): >>><<< 11579 1726882192.57449: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.57452: _low_level_execute_command(): starting 11579 1726882192.57458: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/AnsiballZ_command.py && sleep 0' 11579 1726882192.57917: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882192.57946: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882192.57960: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882192.58030: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.58072: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882192.58089: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.58115: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: 
master version 4 <<< 11579 1726882192.58202: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.74925: stdout chunk (state=3): >>> {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:52.727986", "end": "2024-09-20 21:29:52.747832", "delta": "0:00:00.019846", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882192.76302: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882192.76325: stderr chunk (state=3): >>><<< 11579 1726882192.76328: stdout chunk (state=3): >>><<< 11579 1726882192.76348: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection ", "stderr": "", "rc": 0, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "start": "2024-09-20 21:29:52.727986", "end": "2024-09-20 21:29:52.747832", "delta": "0:00:00.019846", "msg": "", "invocation": {"module_args": {"_raw_params": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882192.76374: done with _execute_module (ansible.legacy.command, {'_raw_params': 'nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882192.76383: _low_level_execute_command(): starting 11579 1726882192.76386: _low_level_execute_command(): executing: /bin/sh -c 'rm -f 
-r /root/.ansible/tmp/ansible-tmp-1726882192.486529-12589-219022410818335/ > /dev/null 2>&1 && sleep 0' 11579 1726882192.76829: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882192.76832: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.76834: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882192.76836: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882192.76838: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882192.76886: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882192.76889: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882192.76895: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882192.76936: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882192.78742: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882192.78745: stdout chunk (state=3): >>><<< 11579 1726882192.78747: stderr chunk (state=3): >>><<< 11579 1726882192.78762: _low_level_execute_command() done: rc=0, 
stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882192.78898: handler run complete 11579 1726882192.78902: Evaluated conditional (False): False 11579 1726882192.78904: attempt loop complete, returning result 11579 1726882192.78906: _execute() done 11579 1726882192.78908: dumping result to json 11579 1726882192.78909: done dumping result, returning 11579 1726882192.78911: done running TaskExecutor() for managed_node1/TASK: Get NM profile info [12673a56-9f93-f197-7423-000000000443] 11579 1726882192.78913: sending task result for task 12673a56-9f93-f197-7423-000000000443 11579 1726882192.78978: done sending task result for task 12673a56-9f93-f197-7423-000000000443 11579 1726882192.78981: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "nmcli -f NAME,FILENAME connection show |grep bond0.1 | grep /etc", "delta": 
"0:00:00.019846", "end": "2024-09-20 21:29:52.747832", "rc": 0, "start": "2024-09-20 21:29:52.727986" } STDOUT: bond0.1 /etc/NetworkManager/system-connections/bond0.1.nmconnection 11579 1726882192.79057: no more pending results, returning what we have 11579 1726882192.79060: results queue empty 11579 1726882192.79061: checking for any_errors_fatal 11579 1726882192.79066: done checking for any_errors_fatal 11579 1726882192.79067: checking for max_fail_percentage 11579 1726882192.79069: done checking for max_fail_percentage 11579 1726882192.79070: checking to see if all hosts have failed and the running result is not ok 11579 1726882192.79071: done checking to see if all hosts have failed 11579 1726882192.79071: getting the remaining hosts for this loop 11579 1726882192.79073: done getting the remaining hosts for this loop 11579 1726882192.79076: getting the next task for host managed_node1 11579 1726882192.79083: done getting next task for host managed_node1 11579 1726882192.79084: ^ task is: TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882192.79088: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882192.79091: getting variables 11579 1726882192.79097: in VariableManager get_vars() 11579 1726882192.79138: Calling all_inventory to load vars for managed_node1 11579 1726882192.79140: Calling groups_inventory to load vars for managed_node1 11579 1726882192.79142: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882192.79153: Calling all_plugins_play to load vars for managed_node1 11579 1726882192.79155: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882192.79158: Calling groups_plugins_play to load vars for managed_node1 11579 1726882192.80650: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882192.82231: done with get_vars() 11579 1726882192.82255: done getting variables 11579 1726882192.82319: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Set NM profile exist flag and ansible_managed flag true based on the nmcli output] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:35 Friday 20 September 2024 21:29:52 -0400 (0:00:00.385) 0:00:21.531 ****** 11579 1726882192.82351: entering _queue_task() for managed_node1/set_fact 11579 1726882192.82664: worker is 1 (out of 1 available) 11579 1726882192.82677: exiting _queue_task() for managed_node1/set_fact 11579 1726882192.82688: done queuing things up, now waiting for results queue to drain 11579 1726882192.82690: waiting for pending results... 
11579 1726882192.82960: running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output 11579 1726882192.83077: in run() - task 12673a56-9f93-f197-7423-000000000444 11579 1726882192.83101: variable 'ansible_search_path' from source: unknown 11579 1726882192.83109: variable 'ansible_search_path' from source: unknown 11579 1726882192.83300: calling self._execute() 11579 1726882192.83303: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882192.83306: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882192.83310: variable 'omit' from source: magic vars 11579 1726882192.83619: variable 'ansible_distribution_major_version' from source: facts 11579 1726882192.83642: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882192.83774: variable 'nm_profile_exists' from source: set_fact 11579 1726882192.83794: Evaluated conditional (nm_profile_exists.rc == 0): True 11579 1726882192.83806: variable 'omit' from source: magic vars 11579 1726882192.83859: variable 'omit' from source: magic vars 11579 1726882192.83896: variable 'omit' from source: magic vars 11579 1726882192.83943: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882192.83987: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882192.84013: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882192.84037: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882192.84052: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882192.84088: variable 'inventory_hostname' from source: host vars for 'managed_node1' 
11579 1726882192.84098: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882192.84106: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882192.84209: Set connection var ansible_timeout to 10 11579 1726882192.84223: Set connection var ansible_shell_type to sh 11579 1726882192.84236: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882192.84245: Set connection var ansible_shell_executable to /bin/sh 11579 1726882192.84255: Set connection var ansible_pipelining to False 11579 1726882192.84261: Set connection var ansible_connection to ssh 11579 1726882192.84284: variable 'ansible_shell_executable' from source: unknown 11579 1726882192.84298: variable 'ansible_connection' from source: unknown 11579 1726882192.84401: variable 'ansible_module_compression' from source: unknown 11579 1726882192.84404: variable 'ansible_shell_type' from source: unknown 11579 1726882192.84406: variable 'ansible_shell_executable' from source: unknown 11579 1726882192.84408: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882192.84410: variable 'ansible_pipelining' from source: unknown 11579 1726882192.84412: variable 'ansible_timeout' from source: unknown 11579 1726882192.84414: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882192.84469: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882192.84484: variable 'omit' from source: magic vars 11579 1726882192.84495: starting attempt loop 11579 1726882192.84504: running the handler 11579 1726882192.84524: handler run complete 11579 1726882192.84539: attempt loop complete, returning result 11579 1726882192.84546: _execute() done 
11579 1726882192.84554: dumping result to json 11579 1726882192.84561: done dumping result, returning 11579 1726882192.84572: done running TaskExecutor() for managed_node1/TASK: Set NM profile exist flag and ansible_managed flag true based on the nmcli output [12673a56-9f93-f197-7423-000000000444] 11579 1726882192.84580: sending task result for task 12673a56-9f93-f197-7423-000000000444 11579 1726882192.84799: done sending task result for task 12673a56-9f93-f197-7423-000000000444 11579 1726882192.84804: WORKER PROCESS EXITING ok: [managed_node1] => { "ansible_facts": { "lsr_net_profile_ansible_managed": true, "lsr_net_profile_exists": true, "lsr_net_profile_fingerprint": true }, "changed": false } 11579 1726882192.84856: no more pending results, returning what we have 11579 1726882192.84859: results queue empty 11579 1726882192.84860: checking for any_errors_fatal 11579 1726882192.84869: done checking for any_errors_fatal 11579 1726882192.84870: checking for max_fail_percentage 11579 1726882192.84872: done checking for max_fail_percentage 11579 1726882192.84873: checking to see if all hosts have failed and the running result is not ok 11579 1726882192.84874: done checking to see if all hosts have failed 11579 1726882192.84874: getting the remaining hosts for this loop 11579 1726882192.84876: done getting the remaining hosts for this loop 11579 1726882192.84879: getting the next task for host managed_node1 11579 1726882192.84888: done getting next task for host managed_node1 11579 1726882192.84890: ^ task is: TASK: Get the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882192.84896: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882192.84901: getting variables 11579 1726882192.84903: in VariableManager get_vars() 11579 1726882192.84942: Calling all_inventory to load vars for managed_node1 11579 1726882192.84945: Calling groups_inventory to load vars for managed_node1 11579 1726882192.84948: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882192.84959: Calling all_plugins_play to load vars for managed_node1 11579 1726882192.84962: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882192.84965: Calling groups_plugins_play to load vars for managed_node1 11579 1726882192.86335: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882192.87967: done with get_vars() 11579 1726882192.87988: done getting variables 11579 1726882192.88048: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882192.88154: variable 'profile' from source: include params 11579 1726882192.88158: variable 'item' from source: include params 11579 1726882192.88219: variable 'item' from source: include params TASK [Get the ansible_managed comment in ifcfg-bond0.1] ************************ task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:49 Friday 20 September 2024 21:29:52 -0400 (0:00:00.059) 0:00:21.590 ****** 11579 1726882192.88255: entering _queue_task() for managed_node1/command 11579 1726882192.88540: worker is 1 (out of 1 available) 11579 1726882192.88552: exiting _queue_task() for managed_node1/command 11579 1726882192.88563: done queuing things up, now waiting for results queue to drain 11579 1726882192.88565: waiting for pending results... 11579 1726882192.88827: running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 11579 1726882192.89001: in run() - task 12673a56-9f93-f197-7423-000000000446 11579 1726882192.89005: variable 'ansible_search_path' from source: unknown 11579 1726882192.89007: variable 'ansible_search_path' from source: unknown 11579 1726882192.89011: calling self._execute() 11579 1726882192.89101: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882192.89126: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882192.89133: variable 'omit' from source: magic vars 11579 1726882192.89559: variable 'ansible_distribution_major_version' from source: facts 11579 1726882192.89563: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882192.89608: variable 'profile_stat' from source: set_fact 11579 1726882192.89627: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882192.89635: when evaluation is False, skipping this task 11579 1726882192.89642: _execute() done 11579 1726882192.89648: dumping result to json 11579 1726882192.89655: done dumping result, returning 11579 1726882192.89665: done running TaskExecutor() for managed_node1/TASK: Get the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-f197-7423-000000000446] 11579 1726882192.89678: sending task result for task 12673a56-9f93-f197-7423-000000000446 
skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882192.89832: no more pending results, returning what we have 11579 1726882192.89836: results queue empty 11579 1726882192.89837: checking for any_errors_fatal 11579 1726882192.89843: done checking for any_errors_fatal 11579 1726882192.89843: checking for max_fail_percentage 11579 1726882192.89845: done checking for max_fail_percentage 11579 1726882192.89846: checking to see if all hosts have failed and the running result is not ok 11579 1726882192.89847: done checking to see if all hosts have failed 11579 1726882192.89848: getting the remaining hosts for this loop 11579 1726882192.89850: done getting the remaining hosts for this loop 11579 1726882192.89854: getting the next task for host managed_node1 11579 1726882192.89861: done getting next task for host managed_node1 11579 1726882192.89864: ^ task is: TASK: Verify the ansible_managed comment in ifcfg-{{ profile }} 11579 1726882192.89869: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882192.89873: getting variables 11579 1726882192.89875: in VariableManager get_vars() 11579 1726882192.89920: Calling all_inventory to load vars for managed_node1 11579 1726882192.89923: Calling groups_inventory to load vars for managed_node1 11579 1726882192.89926: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882192.89940: Calling all_plugins_play to load vars for managed_node1 11579 1726882192.89943: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882192.89946: Calling groups_plugins_play to load vars for managed_node1 11579 1726882192.90606: done sending task result for task 12673a56-9f93-f197-7423-000000000446 11579 1726882192.90610: WORKER PROCESS EXITING 11579 1726882192.91446: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882192.93009: done with get_vars() 11579 1726882192.93031: done getting variables 11579 1726882192.93088: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882192.93191: variable 'profile' from source: include params 11579 1726882192.93196: variable 'item' from source: include params 11579 1726882192.93254: variable 'item' from source: include params TASK [Verify the ansible_managed comment in ifcfg-bond0.1] ********************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:56 Friday 20 September 2024 21:29:52 -0400 (0:00:00.050) 0:00:21.641 ****** 11579 1726882192.93285: entering _queue_task() for managed_node1/set_fact 11579 1726882192.93598: worker is 1 (out of 1 available) 11579 1726882192.93610: exiting _queue_task() for managed_node1/set_fact 11579 
1726882192.93623: done queuing things up, now waiting for results queue to drain 11579 1726882192.93624: waiting for pending results... 11579 1726882192.93897: running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 11579 1726882192.94020: in run() - task 12673a56-9f93-f197-7423-000000000447 11579 1726882192.94040: variable 'ansible_search_path' from source: unknown 11579 1726882192.94048: variable 'ansible_search_path' from source: unknown 11579 1726882192.94088: calling self._execute() 11579 1726882192.94191: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882192.94206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882192.94226: variable 'omit' from source: magic vars 11579 1726882192.94582: variable 'ansible_distribution_major_version' from source: facts 11579 1726882192.94603: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882192.94733: variable 'profile_stat' from source: set_fact 11579 1726882192.94750: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882192.94757: when evaluation is False, skipping this task 11579 1726882192.94764: _execute() done 11579 1726882192.94773: dumping result to json 11579 1726882192.94784: done dumping result, returning 11579 1726882192.94796: done running TaskExecutor() for managed_node1/TASK: Verify the ansible_managed comment in ifcfg-bond0.1 [12673a56-9f93-f197-7423-000000000447] 11579 1726882192.94806: sending task result for task 12673a56-9f93-f197-7423-000000000447 skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882192.94941: no more pending results, returning what we have 11579 1726882192.94945: results queue empty 11579 1726882192.94946: checking for any_errors_fatal 11579 1726882192.94951: done checking for any_errors_fatal 11579 1726882192.94952: 
checking for max_fail_percentage 11579 1726882192.94953: done checking for max_fail_percentage 11579 1726882192.94954: checking to see if all hosts have failed and the running result is not ok 11579 1726882192.94956: done checking to see if all hosts have failed 11579 1726882192.94957: getting the remaining hosts for this loop 11579 1726882192.94959: done getting the remaining hosts for this loop 11579 1726882192.94962: getting the next task for host managed_node1 11579 1726882192.94969: done getting next task for host managed_node1 11579 1726882192.94971: ^ task is: TASK: Get the fingerprint comment in ifcfg-{{ profile }} 11579 1726882192.94976: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882192.94980: getting variables 11579 1726882192.94982: in VariableManager get_vars() 11579 1726882192.95027: Calling all_inventory to load vars for managed_node1 11579 1726882192.95030: Calling groups_inventory to load vars for managed_node1 11579 1726882192.95033: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882192.95047: Calling all_plugins_play to load vars for managed_node1 11579 1726882192.95050: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882192.95053: Calling groups_plugins_play to load vars for managed_node1 11579 1726882192.95896: done sending task result for task 12673a56-9f93-f197-7423-000000000447 11579 1726882192.95899: WORKER PROCESS EXITING 11579 1726882192.96787: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.05378: done with get_vars() 11579 1726882193.05412: done getting variables 11579 1726882193.05460: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882193.05763: variable 'profile' from source: include params 11579 1726882193.05766: variable 'item' from source: include params 11579 1726882193.05826: variable 'item' from source: include params TASK [Get the fingerprint comment in ifcfg-bond0.1] **************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:62 Friday 20 September 2024 21:29:53 -0400 (0:00:00.125) 0:00:21.766 ****** 11579 1726882193.05854: entering _queue_task() for managed_node1/command 11579 1726882193.06597: worker is 1 (out of 1 available) 11579 1726882193.06611: exiting _queue_task() for managed_node1/command 11579 
1726882193.06622: done queuing things up, now waiting for results queue to drain 11579 1726882193.06624: waiting for pending results... 11579 1726882193.07122: running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 11579 1726882193.07603: in run() - task 12673a56-9f93-f197-7423-000000000448 11579 1726882193.07606: variable 'ansible_search_path' from source: unknown 11579 1726882193.07609: variable 'ansible_search_path' from source: unknown 11579 1726882193.07611: calling self._execute() 11579 1726882193.07722: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.07734: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.07750: variable 'omit' from source: magic vars 11579 1726882193.08522: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.08540: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.08815: variable 'profile_stat' from source: set_fact 11579 1726882193.08833: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882193.09009: when evaluation is False, skipping this task 11579 1726882193.09013: _execute() done 11579 1726882193.09016: dumping result to json 11579 1726882193.09019: done dumping result, returning 11579 1726882193.09021: done running TaskExecutor() for managed_node1/TASK: Get the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-f197-7423-000000000448] 11579 1726882193.09023: sending task result for task 12673a56-9f93-f197-7423-000000000448 11579 1726882193.09096: done sending task result for task 12673a56-9f93-f197-7423-000000000448 11579 1726882193.09099: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882193.09164: no more pending results, returning what we have 11579 1726882193.09168: results queue empty 11579 
1726882193.09169: checking for any_errors_fatal 11579 1726882193.09175: done checking for any_errors_fatal 11579 1726882193.09175: checking for max_fail_percentage 11579 1726882193.09178: done checking for max_fail_percentage 11579 1726882193.09178: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.09179: done checking to see if all hosts have failed 11579 1726882193.09180: getting the remaining hosts for this loop 11579 1726882193.09182: done getting the remaining hosts for this loop 11579 1726882193.09186: getting the next task for host managed_node1 11579 1726882193.09196: done getting next task for host managed_node1 11579 1726882193.09199: ^ task is: TASK: Verify the fingerprint comment in ifcfg-{{ profile }} 11579 1726882193.09203: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882193.09208: getting variables 11579 1726882193.09210: in VariableManager get_vars() 11579 1726882193.09258: Calling all_inventory to load vars for managed_node1 11579 1726882193.09260: Calling groups_inventory to load vars for managed_node1 11579 1726882193.09263: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.09277: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.09280: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.09283: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.11121: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.12770: done with get_vars() 11579 1726882193.12799: done getting variables 11579 1726882193.12864: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882193.12988: variable 'profile' from source: include params 11579 1726882193.12996: variable 'item' from source: include params 11579 1726882193.13060: variable 'item' from source: include params TASK [Verify the fingerprint comment in ifcfg-bond0.1] ************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/get_profile_stat.yml:69 Friday 20 September 2024 21:29:53 -0400 (0:00:00.072) 0:00:21.839 ****** 11579 1726882193.13091: entering _queue_task() for managed_node1/set_fact 11579 1726882193.13454: worker is 1 (out of 1 available) 11579 1726882193.13470: exiting _queue_task() for managed_node1/set_fact 11579 1726882193.13482: done queuing things up, now waiting for results queue to drain 11579 1726882193.13483: waiting for pending results... 
11579 1726882193.14600: running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 11579 1726882193.14605: in run() - task 12673a56-9f93-f197-7423-000000000449 11579 1726882193.14609: variable 'ansible_search_path' from source: unknown 11579 1726882193.14611: variable 'ansible_search_path' from source: unknown 11579 1726882193.15001: calling self._execute() 11579 1726882193.15005: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.15009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.15012: variable 'omit' from source: magic vars 11579 1726882193.15724: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.15999: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.16040: variable 'profile_stat' from source: set_fact 11579 1726882193.16063: Evaluated conditional (profile_stat.stat.exists): False 11579 1726882193.16072: when evaluation is False, skipping this task 11579 1726882193.16399: _execute() done 11579 1726882193.16402: dumping result to json 11579 1726882193.16405: done dumping result, returning 11579 1726882193.16408: done running TaskExecutor() for managed_node1/TASK: Verify the fingerprint comment in ifcfg-bond0.1 [12673a56-9f93-f197-7423-000000000449] 11579 1726882193.16410: sending task result for task 12673a56-9f93-f197-7423-000000000449 11579 1726882193.16486: done sending task result for task 12673a56-9f93-f197-7423-000000000449 11579 1726882193.16490: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "profile_stat.stat.exists", "skip_reason": "Conditional result was False" } 11579 1726882193.16540: no more pending results, returning what we have 11579 1726882193.16544: results queue empty 11579 1726882193.16545: checking for any_errors_fatal 11579 1726882193.16550: done checking for any_errors_fatal 11579 1726882193.16551: checking 
for max_fail_percentage 11579 1726882193.16552: done checking for max_fail_percentage 11579 1726882193.16553: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.16554: done checking to see if all hosts have failed 11579 1726882193.16555: getting the remaining hosts for this loop 11579 1726882193.16556: done getting the remaining hosts for this loop 11579 1726882193.16559: getting the next task for host managed_node1 11579 1726882193.16568: done getting next task for host managed_node1 11579 1726882193.16571: ^ task is: TASK: Assert that the profile is present - '{{ profile }}' 11579 1726882193.16574: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882193.16577: getting variables 11579 1726882193.16579: in VariableManager get_vars() 11579 1726882193.16628: Calling all_inventory to load vars for managed_node1 11579 1726882193.16630: Calling groups_inventory to load vars for managed_node1 11579 1726882193.16633: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.16645: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.16648: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.16651: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.19681: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.22605: done with get_vars() 11579 1726882193.22718: done getting variables 11579 1726882193.22782: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882193.22912: variable 'profile' from source: include params 11579 1726882193.22916: variable 'item' from source: include params 11579 1726882193.22981: variable 'item' from source: include params TASK [Assert that the profile is present - 'bond0.1'] ************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:5 Friday 20 September 2024 21:29:53 -0400 (0:00:00.100) 0:00:21.939 ****** 11579 1726882193.23139: entering _queue_task() for managed_node1/assert 11579 1726882193.23837: worker is 1 (out of 1 available) 11579 1726882193.23849: exiting _queue_task() for managed_node1/assert 11579 1726882193.23860: done queuing things up, now waiting for results queue to drain 11579 1726882193.23861: waiting for pending results... 
11579 1726882193.24261: running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' 11579 1726882193.24366: in run() - task 12673a56-9f93-f197-7423-00000000026e 11579 1726882193.24408: variable 'ansible_search_path' from source: unknown 11579 1726882193.24412: variable 'ansible_search_path' from source: unknown 11579 1726882193.24435: calling self._execute() 11579 1726882193.24621: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.24628: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.24638: variable 'omit' from source: magic vars 11579 1726882193.25507: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.25537: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.25542: variable 'omit' from source: magic vars 11579 1726882193.25647: variable 'omit' from source: magic vars 11579 1726882193.25784: variable 'profile' from source: include params 11579 1726882193.25788: variable 'item' from source: include params 11579 1726882193.25992: variable 'item' from source: include params 11579 1726882193.26013: variable 'omit' from source: magic vars 11579 1726882193.26121: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882193.26205: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882193.26236: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882193.26249: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.26410: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.26414: variable 'inventory_hostname' from source: host vars for 
'managed_node1' 11579 1726882193.26417: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.26420: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.26492: Set connection var ansible_timeout to 10 11579 1726882193.26500: Set connection var ansible_shell_type to sh 11579 1726882193.26508: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882193.26517: Set connection var ansible_shell_executable to /bin/sh 11579 1726882193.26520: Set connection var ansible_pipelining to False 11579 1726882193.26523: Set connection var ansible_connection to ssh 11579 1726882193.26547: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.26550: variable 'ansible_connection' from source: unknown 11579 1726882193.26553: variable 'ansible_module_compression' from source: unknown 11579 1726882193.26555: variable 'ansible_shell_type' from source: unknown 11579 1726882193.26558: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.26561: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.26563: variable 'ansible_pipelining' from source: unknown 11579 1726882193.26566: variable 'ansible_timeout' from source: unknown 11579 1726882193.26568: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.26737: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882193.26740: variable 'omit' from source: magic vars 11579 1726882193.26743: starting attempt loop 11579 1726882193.26745: running the handler 11579 1726882193.26850: variable 'lsr_net_profile_exists' from source: set_fact 11579 1726882193.26853: Evaluated conditional 
(lsr_net_profile_exists): True 11579 1726882193.26900: handler run complete 11579 1726882193.26903: attempt loop complete, returning result 11579 1726882193.26905: _execute() done 11579 1726882193.26908: dumping result to json 11579 1726882193.26910: done dumping result, returning 11579 1726882193.26913: done running TaskExecutor() for managed_node1/TASK: Assert that the profile is present - 'bond0.1' [12673a56-9f93-f197-7423-00000000026e] 11579 1726882193.26916: sending task result for task 12673a56-9f93-f197-7423-00000000026e 11579 1726882193.27115: done sending task result for task 12673a56-9f93-f197-7423-00000000026e 11579 1726882193.27118: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882193.27168: no more pending results, returning what we have 11579 1726882193.27171: results queue empty 11579 1726882193.27172: checking for any_errors_fatal 11579 1726882193.27178: done checking for any_errors_fatal 11579 1726882193.27179: checking for max_fail_percentage 11579 1726882193.27181: done checking for max_fail_percentage 11579 1726882193.27182: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.27183: done checking to see if all hosts have failed 11579 1726882193.27183: getting the remaining hosts for this loop 11579 1726882193.27185: done getting the remaining hosts for this loop 11579 1726882193.27189: getting the next task for host managed_node1 11579 1726882193.27198: done getting next task for host managed_node1 11579 1726882193.27201: ^ task is: TASK: Assert that the ansible managed comment is present in '{{ profile }}' 11579 1726882193.27204: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882193.27208: getting variables 11579 1726882193.27210: in VariableManager get_vars() 11579 1726882193.27259: Calling all_inventory to load vars for managed_node1 11579 1726882193.27262: Calling groups_inventory to load vars for managed_node1 11579 1726882193.27265: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.27277: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.27280: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.27284: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.29488: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.31059: done with get_vars() 11579 1726882193.31092: done getting variables 11579 1726882193.31158: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882193.31280: variable 'profile' from source: include params 11579 1726882193.31286: variable 'item' from source: include params 11579 1726882193.31349: variable 'item' from source: include params TASK [Assert that the ansible managed comment is present in 'bond0.1'] ********* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:10 Friday 20 September 2024 21:29:53 -0400 
(0:00:00.082) 0:00:22.022 ****** 11579 1726882193.31385: entering _queue_task() for managed_node1/assert 11579 1726882193.31927: worker is 1 (out of 1 available) 11579 1726882193.31937: exiting _queue_task() for managed_node1/assert 11579 1726882193.31947: done queuing things up, now waiting for results queue to drain 11579 1726882193.31948: waiting for pending results... 11579 1726882193.32052: running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' 11579 1726882193.32177: in run() - task 12673a56-9f93-f197-7423-00000000026f 11579 1726882193.32205: variable 'ansible_search_path' from source: unknown 11579 1726882193.32285: variable 'ansible_search_path' from source: unknown 11579 1726882193.32289: calling self._execute() 11579 1726882193.32408: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.32424: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.32441: variable 'omit' from source: magic vars 11579 1726882193.32900: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.32924: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.32955: variable 'omit' from source: magic vars 11579 1726882193.33007: variable 'omit' from source: magic vars 11579 1726882193.33150: variable 'profile' from source: include params 11579 1726882193.33170: variable 'item' from source: include params 11579 1726882193.33287: variable 'item' from source: include params 11579 1726882193.33291: variable 'omit' from source: magic vars 11579 1726882193.33338: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882193.33384: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882193.33413: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 
1726882193.33436: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.33488: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.33491: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882193.33498: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.33507: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.33618: Set connection var ansible_timeout to 10 11579 1726882193.33635: Set connection var ansible_shell_type to sh 11579 1726882193.33648: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882193.33720: Set connection var ansible_shell_executable to /bin/sh 11579 1726882193.33723: Set connection var ansible_pipelining to False 11579 1726882193.33726: Set connection var ansible_connection to ssh 11579 1726882193.33728: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.33730: variable 'ansible_connection' from source: unknown 11579 1726882193.33732: variable 'ansible_module_compression' from source: unknown 11579 1726882193.33734: variable 'ansible_shell_type' from source: unknown 11579 1726882193.33736: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.33748: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.33768: variable 'ansible_pipelining' from source: unknown 11579 1726882193.33777: variable 'ansible_timeout' from source: unknown 11579 1726882193.33784: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.33957: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882193.33999: variable 'omit' from source: magic vars 11579 1726882193.34002: starting attempt loop 11579 1726882193.34005: running the handler 11579 1726882193.34112: variable 'lsr_net_profile_ansible_managed' from source: set_fact 11579 1726882193.34123: Evaluated conditional (lsr_net_profile_ansible_managed): True 11579 1726882193.34133: handler run complete 11579 1726882193.34200: attempt loop complete, returning result 11579 1726882193.34203: _execute() done 11579 1726882193.34206: dumping result to json 11579 1726882193.34208: done dumping result, returning 11579 1726882193.34210: done running TaskExecutor() for managed_node1/TASK: Assert that the ansible managed comment is present in 'bond0.1' [12673a56-9f93-f197-7423-00000000026f] 11579 1726882193.34212: sending task result for task 12673a56-9f93-f197-7423-00000000026f 11579 1726882193.34500: done sending task result for task 12673a56-9f93-f197-7423-00000000026f 11579 1726882193.34504: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882193.34646: no more pending results, returning what we have 11579 1726882193.34650: results queue empty 11579 1726882193.34651: checking for any_errors_fatal 11579 1726882193.34660: done checking for any_errors_fatal 11579 1726882193.34661: checking for max_fail_percentage 11579 1726882193.34663: done checking for max_fail_percentage 11579 1726882193.34664: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.34665: done checking to see if all hosts have failed 11579 1726882193.34666: getting the remaining hosts for this loop 11579 1726882193.34667: done getting the remaining hosts for this loop 11579 1726882193.34671: getting the next task for host managed_node1 11579 1726882193.34679: done getting 
next task for host managed_node1 11579 1726882193.34681: ^ task is: TASK: Assert that the fingerprint comment is present in {{ profile }} 11579 1726882193.34684: ^ state is: HOST STATE: block=2, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882193.34689: getting variables 11579 1726882193.34690: in VariableManager get_vars() 11579 1726882193.34742: Calling all_inventory to load vars for managed_node1 11579 1726882193.34746: Calling groups_inventory to load vars for managed_node1 11579 1726882193.34749: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.34761: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.34764: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.34767: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.37933: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.39976: done with get_vars() 11579 1726882193.40009: done getting variables 11579 1726882193.40071: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) 11579 1726882193.40181: variable 'profile' from source: include params 11579 1726882193.40185: variable 'item' from 
source: include params 11579 1726882193.40243: variable 'item' from source: include params TASK [Assert that the fingerprint comment is present in bond0.1] *************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/assert_profile_present.yml:15 Friday 20 September 2024 21:29:53 -0400 (0:00:00.088) 0:00:22.111 ****** 11579 1726882193.40280: entering _queue_task() for managed_node1/assert 11579 1726882193.40633: worker is 1 (out of 1 available) 11579 1726882193.40646: exiting _queue_task() for managed_node1/assert 11579 1726882193.40657: done queuing things up, now waiting for results queue to drain 11579 1726882193.40658: waiting for pending results... 11579 1726882193.40940: running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 11579 1726882193.41053: in run() - task 12673a56-9f93-f197-7423-000000000270 11579 1726882193.41076: variable 'ansible_search_path' from source: unknown 11579 1726882193.41089: variable 'ansible_search_path' from source: unknown 11579 1726882193.41301: calling self._execute() 11579 1726882193.41305: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.41308: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.41311: variable 'omit' from source: magic vars 11579 1726882193.41625: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.41647: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.41658: variable 'omit' from source: magic vars 11579 1726882193.41704: variable 'omit' from source: magic vars 11579 1726882193.41808: variable 'profile' from source: include params 11579 1726882193.41819: variable 'item' from source: include params 11579 1726882193.41886: variable 'item' from source: include params 11579 1726882193.41914: variable 'omit' from source: magic vars 11579 1726882193.41958: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882193.42008: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882193.42033: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882193.42056: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.42073: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.42115: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882193.42123: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.42131: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.42242: Set connection var ansible_timeout to 10 11579 1726882193.42255: Set connection var ansible_shell_type to sh 11579 1726882193.42300: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882193.42304: Set connection var ansible_shell_executable to /bin/sh 11579 1726882193.42307: Set connection var ansible_pipelining to False 11579 1726882193.42309: Set connection var ansible_connection to ssh 11579 1726882193.42322: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.42330: variable 'ansible_connection' from source: unknown 11579 1726882193.42337: variable 'ansible_module_compression' from source: unknown 11579 1726882193.42345: variable 'ansible_shell_type' from source: unknown 11579 1726882193.42352: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.42409: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.42412: variable 'ansible_pipelining' from source: unknown 11579 1726882193.42415: variable 'ansible_timeout' 
from source: unknown 11579 1726882193.42418: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.42521: Loading ActionModule 'assert' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/assert.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882193.42540: variable 'omit' from source: magic vars 11579 1726882193.42551: starting attempt loop 11579 1726882193.42557: running the handler 11579 1726882193.42667: variable 'lsr_net_profile_fingerprint' from source: set_fact 11579 1726882193.42677: Evaluated conditional (lsr_net_profile_fingerprint): True 11579 1726882193.42699: handler run complete 11579 1726882193.42711: attempt loop complete, returning result 11579 1726882193.42718: _execute() done 11579 1726882193.42732: dumping result to json 11579 1726882193.42841: done dumping result, returning 11579 1726882193.42845: done running TaskExecutor() for managed_node1/TASK: Assert that the fingerprint comment is present in bond0.1 [12673a56-9f93-f197-7423-000000000270] 11579 1726882193.42847: sending task result for task 12673a56-9f93-f197-7423-000000000270 11579 1726882193.42919: done sending task result for task 12673a56-9f93-f197-7423-000000000270 11579 1726882193.42922: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false } MSG: All assertions passed 11579 1726882193.42991: no more pending results, returning what we have 11579 1726882193.42998: results queue empty 11579 1726882193.42999: checking for any_errors_fatal 11579 1726882193.43006: done checking for any_errors_fatal 11579 1726882193.43006: checking for max_fail_percentage 11579 1726882193.43008: done checking for max_fail_percentage 11579 1726882193.43009: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.43010: done checking to see if all 
hosts have failed 11579 1726882193.43011: getting the remaining hosts for this loop 11579 1726882193.43013: done getting the remaining hosts for this loop 11579 1726882193.43016: getting the next task for host managed_node1 11579 1726882193.43025: done getting next task for host managed_node1 11579 1726882193.43027: ^ task is: TASK: ** TEST check polling interval 11579 1726882193.43029: ^ state is: HOST STATE: block=2, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882193.43034: getting variables 11579 1726882193.43036: in VariableManager get_vars() 11579 1726882193.43081: Calling all_inventory to load vars for managed_node1 11579 1726882193.43084: Calling groups_inventory to load vars for managed_node1 11579 1726882193.43088: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.43205: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.43209: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.43213: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.44615: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.46327: done with get_vars() 11579 1726882193.46347: done getting variables 11579 1726882193.46408: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check polling interval] ****************************************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75 Friday 20 September 2024 21:29:53 -0400 (0:00:00.061) 0:00:22.172 ****** 11579 1726882193.46436: entering _queue_task() for managed_node1/command 11579 1726882193.46835: worker is 1 (out of 1 available) 11579 1726882193.46847: exiting _queue_task() for managed_node1/command 11579 1726882193.46857: done queuing things up, now waiting for results queue to drain 11579 1726882193.46858: waiting for pending results... 11579 1726882193.47038: running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval 11579 1726882193.47191: in run() - task 12673a56-9f93-f197-7423-000000000071 11579 1726882193.47200: variable 'ansible_search_path' from source: unknown 11579 1726882193.47209: calling self._execute() 11579 1726882193.47318: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.47334: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.47352: variable 'omit' from source: magic vars 11579 1726882193.47727: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.47801: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.47804: variable 'omit' from source: magic vars 11579 1726882193.47806: variable 'omit' from source: magic vars 11579 1726882193.47880: variable 'controller_device' from source: play vars 11579 1726882193.47907: variable 'omit' from source: magic vars 11579 1726882193.47998: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882193.48044: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882193.48068: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882193.48096: Loading ShellModule 'sh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.48113: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.48200: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882193.48203: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.48206: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.48274: Set connection var ansible_timeout to 10 11579 1726882193.48286: Set connection var ansible_shell_type to sh 11579 1726882193.48303: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882193.48314: Set connection var ansible_shell_executable to /bin/sh 11579 1726882193.48327: Set connection var ansible_pipelining to False 11579 1726882193.48334: Set connection var ansible_connection to ssh 11579 1726882193.48362: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.48370: variable 'ansible_connection' from source: unknown 11579 1726882193.48462: variable 'ansible_module_compression' from source: unknown 11579 1726882193.48465: variable 'ansible_shell_type' from source: unknown 11579 1726882193.48468: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.48469: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.48471: variable 'ansible_pipelining' from source: unknown 11579 1726882193.48473: variable 'ansible_timeout' from source: unknown 11579 1726882193.48475: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.48550: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882193.48569: variable 'omit' from source: magic vars 11579 1726882193.48578: starting attempt loop 11579 1726882193.48589: running the handler 11579 1726882193.48613: _low_level_execute_command(): starting 11579 1726882193.48626: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882193.49417: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882193.49471: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882193.49489: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882193.49517: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882193.49607: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.51300: stdout chunk (state=3): >>>/root <<< 11579 1726882193.51462: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11579 1726882193.51467: stdout chunk (state=3): >>><<< 11579 1726882193.51469: stderr chunk (state=3): >>><<< 11579 1726882193.51601: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882193.51606: _low_level_execute_command(): starting 11579 1726882193.51609: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373 `" && echo ansible-tmp-1726882193.5150156-12640-261764729533373="` echo /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373 `" ) && sleep 0' 11579 1726882193.52206: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882193.52220: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882193.52284: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882193.52360: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882193.52401: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882193.52474: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.54702: stdout chunk (state=3): >>>ansible-tmp-1726882193.5150156-12640-261764729533373=/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373 <<< 11579 1726882193.54705: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882193.54708: stdout chunk (state=3): >>><<< 11579 1726882193.54710: stderr chunk (state=3): >>><<< 11579 1726882193.54713: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882193.5150156-12640-261764729533373=/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882193.54715: variable 'ansible_module_compression' from source: unknown 11579 1726882193.54756: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882193.54847: variable 'ansible_facts' from source: unknown 11579 1726882193.55015: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py 11579 1726882193.55287: Sending initial data 11579 1726882193.55291: Sent initial data (156 bytes) 11579 1726882193.57172: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882193.57228: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882193.57343: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.58869: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11579 1726882193.58912: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882193.58977: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py" <<< 11579 1726882193.59009: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp5nw1yth7 /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py <<< 11579 1726882193.59062: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp5nw1yth7" to remote "/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py" <<< 11579 1726882193.61853: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882193.61857: stdout chunk (state=3): >>><<< 11579 1726882193.61860: stderr chunk (state=3): >>><<< 11579 1726882193.61862: done transferring module to remote 11579 1726882193.61864: _low_level_execute_command(): starting 11579 1726882193.61867: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/ /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py && sleep 0' 11579 1726882193.63150: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882193.63154: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882193.63158: stderr chunk (state=3): >>>debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882193.63160: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882193.63239: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882193.63511: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.65397: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882193.65401: stdout chunk (state=3): >>><<< 11579 1726882193.65403: stderr chunk (state=3): >>><<< 11579 1726882193.65405: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882193.65411: _low_level_execute_command(): starting 11579 1726882193.65413: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/AnsiballZ_command.py && sleep 0' 11579 1726882193.66711: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882193.66957: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882193.66970: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882193.67092: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.82264: stdout chunk (state=3): >>> {"changed": true, "stdout": "MII 
Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:29:53.817995", "end": "2024-09-20 21:29:53.821195", "delta": "0:00:00.003200", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882193.83644: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882193.83648: stdout chunk (state=3): >>><<< 11579 1726882193.83821: stderr chunk (state=3): >>><<< 11579 1726882193.83824: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "MII Polling Interval (ms): 110", "stderr": "", "rc": 0, "cmd": ["grep", "Polling Interval", "/proc/net/bonding/nm-bond"], "start": "2024-09-20 21:29:53.817995", "end": "2024-09-20 21:29:53.821195", "delta": "0:00:00.003200", "msg": "", "invocation": {"module_args": {"_raw_params": "grep 'Polling Interval' /proc/net/bonding/nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: 
Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882193.83828: done with _execute_module (ansible.legacy.command, {'_raw_params': "grep 'Polling Interval' /proc/net/bonding/nm-bond", '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882193.83831: _low_level_execute_command(): starting 11579 1726882193.83834: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882193.5150156-12640-261764729533373/ > /dev/null 2>&1 && sleep 0' 11579 1726882193.84802: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882193.85097: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882193.85114: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882193.85126: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882193.85190: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882193.87119: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882193.87123: stdout chunk (state=3): >>><<< 11579 1726882193.87125: stderr chunk (state=3): >>><<< 11579 1726882193.87127: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882193.87133: handler run complete 11579 1726882193.87135: Evaluated conditional (False): False 11579 1726882193.87270: variable 'result' from source: unknown 11579 1726882193.87286: Evaluated conditional ('110' in result.stdout): True 11579 1726882193.87504: attempt loop complete, returning result 11579 1726882193.87507: _execute() done 11579 1726882193.87510: dumping result to json 11579 1726882193.87512: done dumping result, returning 11579 1726882193.87522: done running TaskExecutor() for managed_node1/TASK: ** TEST check polling interval [12673a56-9f93-f197-7423-000000000071] 11579 1726882193.87527: sending task result for task 12673a56-9f93-f197-7423-000000000071 11579 1726882193.87900: done sending task result for task 12673a56-9f93-f197-7423-000000000071 11579 1726882193.87903: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "grep", "Polling Interval", "/proc/net/bonding/nm-bond" ], "delta": "0:00:00.003200", "end": "2024-09-20 21:29:53.821195", "rc": 0, "start": "2024-09-20 21:29:53.817995" } STDOUT: MII Polling Interval (ms): 110 11579 1726882193.87982: no more pending results, returning what we have 11579 1726882193.87985: results queue empty 11579 1726882193.87986: checking for any_errors_fatal 11579 1726882193.87990: done checking for any_errors_fatal 11579 1726882193.87991: checking for max_fail_percentage 
11579 1726882193.87997: done checking for max_fail_percentage 11579 1726882193.87998: checking to see if all hosts have failed and the running result is not ok 11579 1726882193.87999: done checking to see if all hosts have failed 11579 1726882193.87999: getting the remaining hosts for this loop 11579 1726882193.88001: done getting the remaining hosts for this loop 11579 1726882193.88005: getting the next task for host managed_node1 11579 1726882193.88011: done getting next task for host managed_node1 11579 1726882193.88018: ^ task is: TASK: ** TEST check IPv4 11579 1726882193.88021: ^ state is: HOST STATE: block=2, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882193.88024: getting variables 11579 1726882193.88026: in VariableManager get_vars() 11579 1726882193.88067: Calling all_inventory to load vars for managed_node1 11579 1726882193.88069: Calling groups_inventory to load vars for managed_node1 11579 1726882193.88072: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882193.88083: Calling all_plugins_play to load vars for managed_node1 11579 1726882193.88086: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882193.88089: Calling groups_plugins_play to load vars for managed_node1 11579 1726882193.91158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882193.94824: done with get_vars() 11579 1726882193.94854: done getting variables 11579 1726882193.94950: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) TASK [** TEST check IPv4] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80 Friday 20 September 2024 21:29:53 -0400 (0:00:00.486) 0:00:22.659 ****** 11579 1726882193.95097: entering _queue_task() for managed_node1/command 11579 1726882193.95791: worker is 1 (out of 1 available) 11579 1726882193.95806: exiting _queue_task() for managed_node1/command 11579 1726882193.95818: done queuing things up, now waiting for results queue to drain 11579 1726882193.95819: waiting for pending results... 11579 1726882193.96513: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 11579 1726882193.96787: in run() - task 12673a56-9f93-f197-7423-000000000072 11579 1726882193.96791: variable 'ansible_search_path' from source: unknown 11579 1726882193.96885: calling self._execute() 11579 1726882193.96889: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.96896: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.96944: variable 'omit' from source: magic vars 11579 1726882193.97686: variable 'ansible_distribution_major_version' from source: facts 11579 1726882193.97702: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882193.97708: variable 'omit' from source: magic vars 11579 1726882193.97816: variable 'omit' from source: magic vars 11579 1726882193.98067: variable 'controller_device' from source: play vars 11579 1726882193.98085: variable 'omit' from source: magic vars 11579 1726882193.98128: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882193.98215: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882193.98235: trying 
/usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882193.98253: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.98382: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882193.98414: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882193.98419: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.98422: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.98640: Set connection var ansible_timeout to 10 11579 1726882193.98646: Set connection var ansible_shell_type to sh 11579 1726882193.98654: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882193.98659: Set connection var ansible_shell_executable to /bin/sh 11579 1726882193.98667: Set connection var ansible_pipelining to False 11579 1726882193.98670: Set connection var ansible_connection to ssh 11579 1726882193.98800: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.98807: variable 'ansible_connection' from source: unknown 11579 1726882193.98814: variable 'ansible_module_compression' from source: unknown 11579 1726882193.98817: variable 'ansible_shell_type' from source: unknown 11579 1726882193.98822: variable 'ansible_shell_executable' from source: unknown 11579 1726882193.98825: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882193.98830: variable 'ansible_pipelining' from source: unknown 11579 1726882193.98833: variable 'ansible_timeout' from source: unknown 11579 1726882193.98837: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882193.99065: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882193.99078: variable 'omit' from source: magic vars 11579 1726882193.99084: starting attempt loop 11579 1726882193.99087: running the handler 11579 1726882193.99255: _low_level_execute_command(): starting 11579 1726882193.99263: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882194.00571: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.00581: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.00842: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.02444: stdout chunk (state=3): >>>/root <<< 11579 1726882194.02801: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.02804: stdout chunk (state=3): >>><<< 11579 
1726882194.02807: stderr chunk (state=3): >>><<< 11579 1726882194.02810: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.02813: _low_level_execute_command(): starting 11579 1726882194.02816: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988 `" && echo ansible-tmp-1726882194.0267248-12666-196289785892988="` echo /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988 `" ) && sleep 0' 11579 1726882194.03938: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.03947: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882194.04259: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.04269: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.04280: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.04299: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.04368: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.06222: stdout chunk (state=3): >>>ansible-tmp-1726882194.0267248-12666-196289785892988=/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988 <<< 11579 1726882194.06359: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.06363: stderr chunk (state=3): >>><<< 11579 1726882194.06365: stdout chunk (state=3): >>><<< 11579 1726882194.06389: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882194.0267248-12666-196289785892988=/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.06502: variable 'ansible_module_compression' from source: unknown 11579 1726882194.06559: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882194.06600: variable 'ansible_facts' from source: unknown 11579 1726882194.06738: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py 11579 1726882194.07401: Sending initial data 11579 1726882194.07405: Sent initial data (156 bytes) 11579 1726882194.08142: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.08148: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882194.08211: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 
10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.08249: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.08252: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.08429: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.08536: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.09983: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11579 1726882194.09987: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882194.10024: stderr chunk 
(state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882194.10266: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw10j92c9 /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py <<< 11579 1726882194.10281: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py" debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw10j92c9" to remote "/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py" <<< 11579 1726882194.11384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.11483: stderr chunk (state=3): >>><<< 11579 1726882194.11486: stdout chunk (state=3): >>><<< 11579 1726882194.11510: done transferring module to remote 11579 1726882194.11519: _low_level_execute_command(): starting 11579 1726882194.11523: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/ /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py && sleep 0' 11579 1726882194.12702: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.12708: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882194.12901: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.12924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.13024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.14946: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.14950: stderr chunk (state=3): >>><<< 11579 1726882194.14952: stdout chunk (state=3): >>><<< 11579 1726882194.14977: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.14980: _low_level_execute_command(): starting 11579 1726882194.14985: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/AnsiballZ_command.py && sleep 0' 11579 1726882194.16411: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.16661: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.16707: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.31833: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default 
qlen 1000\n inet 192.0.2.244/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:29:54.313514", "end": "2024-09-20 21:29:54.316853", "delta": "0:00:00.003339", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882194.33262: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882194.33266: stdout chunk (state=3): >>><<< 11579 1726882194.33269: stderr chunk (state=3): >>><<< 11579 1726882194.33291: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet 192.0.2.244/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond\n valid_lft 236sec preferred_lft 236sec", "stderr": "", "rc": 0, "cmd": ["ip", "-4", "a", "s", "nm-bond"], "start": "2024-09-20 21:29:54.313514", "end": "2024-09-20 21:29:54.316853", "delta": "0:00:00.003339", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -4 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882194.33337: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -4 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882194.33801: _low_level_execute_command(): starting 11579 1726882194.33804: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882194.0267248-12666-196289785892988/ > /dev/null 2>&1 && sleep 0' 11579 1726882194.34810: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.34890: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.34958: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.36734: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.36764: stderr chunk (state=3): >>><<< 11579 1726882194.36773: stdout chunk (state=3): >>><<< 11579 1726882194.36812: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.36826: handler run complete 11579 1726882194.36853: Evaluated conditional (False): False 11579 1726882194.37291: variable 'result' from source: set_fact 11579 1726882194.37699: Evaluated conditional ('192.0.2' in result.stdout): True 11579 1726882194.37703: attempt loop complete, returning result 11579 1726882194.37705: _execute() done 11579 1726882194.37707: dumping result to json 11579 1726882194.37709: done dumping result, returning 11579 1726882194.37711: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv4 [12673a56-9f93-f197-7423-000000000072] 11579 1726882194.37713: sending task result for task 12673a56-9f93-f197-7423-000000000072 11579 1726882194.37784: done sending task result for task 12673a56-9f93-f197-7423-000000000072 11579 1726882194.37787: WORKER PROCESS EXITING ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-4", "a", "s", "nm-bond" ], "delta": "0:00:00.003339", "end": "2024-09-20 21:29:54.316853", "rc": 0, "start": "2024-09-20 21:29:54.313514" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet 192.0.2.244/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond valid_lft 236sec preferred_lft 236sec 11579 1726882194.37876: no more pending results, returning what we have 11579 1726882194.37880: results queue empty 11579 1726882194.37881: checking for 
any_errors_fatal 11579 1726882194.37896: done checking for any_errors_fatal 11579 1726882194.37897: checking for max_fail_percentage 11579 1726882194.37900: done checking for max_fail_percentage 11579 1726882194.37901: checking to see if all hosts have failed and the running result is not ok 11579 1726882194.37902: done checking to see if all hosts have failed 11579 1726882194.37903: getting the remaining hosts for this loop 11579 1726882194.37905: done getting the remaining hosts for this loop 11579 1726882194.37908: getting the next task for host managed_node1 11579 1726882194.37914: done getting next task for host managed_node1 11579 1726882194.37916: ^ task is: TASK: ** TEST check IPv6 11579 1726882194.37918: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882194.37921: getting variables 11579 1726882194.37923: in VariableManager get_vars() 11579 1726882194.37962: Calling all_inventory to load vars for managed_node1 11579 1726882194.37964: Calling groups_inventory to load vars for managed_node1 11579 1726882194.37966: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882194.37976: Calling all_plugins_play to load vars for managed_node1 11579 1726882194.37979: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882194.37981: Calling groups_plugins_play to load vars for managed_node1 11579 1726882194.42083: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882194.46949: done with get_vars() 11579 1726882194.47314: done getting variables 11579 1726882194.47371: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [** TEST check IPv6] ****************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87 Friday 20 September 2024 21:29:54 -0400 (0:00:00.523) 0:00:23.183 ****** 11579 1726882194.47507: entering _queue_task() for managed_node1/command 11579 1726882194.48660: worker is 1 (out of 1 available) 11579 1726882194.48672: exiting _queue_task() for managed_node1/command 11579 1726882194.48683: done queuing things up, now waiting for results queue to drain 11579 1726882194.48684: waiting for pending results... 
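The IPv4 check that just passed boils down to running `ip -4 a s nm-bond` through the command module and then evaluating the conditional `('192.0.2' in result.stdout)`. A minimal shell sketch of that assertion, reusing the stdout captured in this run (device name `nm-bond` and the `192.0.2.0/24` test subnet are taken from the log above; on any other host they are assumptions):

```shell
# Stdout as returned by `ip -4 a s nm-bond` in the run above
stdout='13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000
    inet 192.0.2.244/24 brd 192.0.2.255 scope global dynamic noprefixroute nm-bond
       valid_lft 236sec preferred_lft 236sec'

# The task conditional `'192.0.2' in result.stdout` is a plain substring test:
case "$stdout" in
  *"192.0.2"*) echo "PASS: nm-bond holds an address in 192.0.2.0/24" ;;
  *)           echo "FAIL: no 192.0.2.x address on nm-bond" >&2; exit 1 ;;
esac
```

Because the module was invoked with `_uses_shell: false` and `argv: null`, the `cmd` in the result is the split list `["ip", "-4", "a", "s", "nm-bond"]` rather than a shell string.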
11579 1726882194.49240: running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 11579 1726882194.49555: in run() - task 12673a56-9f93-f197-7423-000000000073 11579 1726882194.49558: variable 'ansible_search_path' from source: unknown 11579 1726882194.49561: calling self._execute() 11579 1726882194.49732: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882194.49745: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882194.49761: variable 'omit' from source: magic vars 11579 1726882194.50537: variable 'ansible_distribution_major_version' from source: facts 11579 1726882194.50557: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882194.50608: variable 'omit' from source: magic vars 11579 1726882194.50636: variable 'omit' from source: magic vars 11579 1726882194.50855: variable 'controller_device' from source: play vars 11579 1726882194.50883: variable 'omit' from source: magic vars 11579 1726882194.51022: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882194.51067: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882194.51310: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882194.51313: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882194.51315: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882194.51317: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882194.51319: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882194.51321: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882194.51483: Set 
connection var ansible_timeout to 10 11579 1726882194.51712: Set connection var ansible_shell_type to sh 11579 1726882194.51715: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882194.51717: Set connection var ansible_shell_executable to /bin/sh 11579 1726882194.51719: Set connection var ansible_pipelining to False 11579 1726882194.51722: Set connection var ansible_connection to ssh 11579 1726882194.51724: variable 'ansible_shell_executable' from source: unknown 11579 1726882194.51726: variable 'ansible_connection' from source: unknown 11579 1726882194.51731: variable 'ansible_module_compression' from source: unknown 11579 1726882194.51733: variable 'ansible_shell_type' from source: unknown 11579 1726882194.51735: variable 'ansible_shell_executable' from source: unknown 11579 1726882194.51737: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882194.51854: variable 'ansible_pipelining' from source: unknown 11579 1726882194.51857: variable 'ansible_timeout' from source: unknown 11579 1726882194.51860: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882194.52059: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882194.52116: variable 'omit' from source: magic vars 11579 1726882194.52127: starting attempt loop 11579 1726882194.52135: running the handler 11579 1726882194.52203: _low_level_execute_command(): starting 11579 1726882194.52218: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882194.53918: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.53948: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.53972: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.54003: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.54102: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.55787: stdout chunk (state=3): >>>/root <<< 11579 1726882194.56012: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.56016: stdout chunk (state=3): >>><<< 11579 1726882194.56018: stderr chunk (state=3): >>><<< 11579 1726882194.56043: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass 
debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.56237: _low_level_execute_command(): starting 11579 1726882194.56249: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040 `" && echo ansible-tmp-1726882194.561448-12683-2476540867040="` echo /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040 `" ) && sleep 0' 11579 1726882194.57315: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882194.57330: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.57348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882194.57370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882194.57387: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882194.57404: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882194.57419: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.57511: stderr chunk (state=3): >>>debug1: 
configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.57700: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.57720: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.57742: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.57815: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.59788: stdout chunk (state=3): >>>ansible-tmp-1726882194.561448-12683-2476540867040=/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040 <<< 11579 1726882194.59953: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.59959: stdout chunk (state=3): >>><<< 11579 1726882194.59961: stderr chunk (state=3): >>><<< 11579 1726882194.60188: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882194.561448-12683-2476540867040=/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: 
hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.60191: variable 'ansible_module_compression' from source: unknown 11579 1726882194.60198: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882194.60236: variable 'ansible_facts' from source: unknown 11579 1726882194.60433: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py 11579 1726882194.60656: Sending initial data 11579 1726882194.60664: Sent initial data (153 bytes) 11579 1726882194.62567: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.62580: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.62612: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config 
debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.62905: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.62909: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.63045: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.64547: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882194.64586: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882194.64634: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp08adkbi4 /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py <<< 11579 1726882194.64641: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py" <<< 11579 1726882194.64674: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp08adkbi4" to remote "/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py" <<< 11579 1726882194.66114: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.66173: stderr chunk (state=3): >>><<< 11579 1726882194.66182: stdout chunk (state=3): >>><<< 11579 1726882194.66411: done transferring module to remote 11579 1726882194.66415: _low_level_execute_command(): starting 11579 1726882194.66417: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/ /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py && sleep 0' 11579 1726882194.67641: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing 
configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.67811: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.67843: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.68090: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.69959: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.69963: stdout chunk (state=3): >>><<< 11579 1726882194.69965: stderr chunk (state=3): >>><<< 11579 1726882194.69968: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.69970: _low_level_execute_command(): starting 11579 1726882194.69973: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/AnsiballZ_command.py && sleep 0' 11579 1726882194.71089: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882194.71264: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882194.71376: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.71419: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.71464: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.86702: stdout chunk (state=3): >>> {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state 
UP group default qlen 1000\n inet6 2001:db8::98/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::7483:d5ff:fecc:1d46/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::7483:d5ff:fecc:1d46/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:29:54.862007", "end": "2024-09-20 21:29:54.865432", "delta": "0:00:00.003425", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882194.88452: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882194.88456: stdout chunk (state=3): >>><<< 11579 1726882194.88459: stderr chunk (state=3): >>><<< 11579 1726882194.88461: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n inet6 2001:db8::98/128 scope global dynamic noprefixroute \n valid_lft 236sec preferred_lft 236sec\n inet6 2001:db8::7483:d5ff:fecc:1d46/64 scope global dynamic noprefixroute \n valid_lft 1796sec preferred_lft 1796sec\n inet6 fe80::7483:d5ff:fecc:1d46/64 scope link noprefixroute \n valid_lft forever preferred_lft forever", "stderr": "", "rc": 0, "cmd": ["ip", "-6", "a", "s", "nm-bond"], "start": "2024-09-20 21:29:54.862007", "end": "2024-09-20 21:29:54.865432", "delta": "0:00:00.003425", "msg": "", "invocation": {"module_args": {"_raw_params": "ip -6 a s nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": 
null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
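The module reply traced above is a single JSON document printed on the remote process's stdout, which the executor then decodes into the task result. A minimal sketch of that decode step, using the keys visible in the log (the long `stdout` value is abbreviated here):

```python
# Sketch: an Ansible module process emits exactly one JSON object on stdout;
# the controller json-decodes it to build the task result. Keys match the
# reply in the log; the "stdout" value is abbreviated for illustration.
import json

module_stdout = (
    '{"changed": true, "stdout": "13: nm-bond: mtu 1500 ...", '
    '"stderr": "", "rc": 0, '
    '"cmd": ["ip", "-6", "a", "s", "nm-bond"]}'
)

result = json.loads(module_stdout)
print(result["rc"], result["changed"], result["cmd"][0])
```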
11579 1726882194.88804: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip -6 a s nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882194.88809: _low_level_execute_command(): starting 11579 1726882194.88811: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882194.561448-12683-2476540867040/ > /dev/null 2>&1 && sleep 0' 11579 1726882194.90278: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882194.90321: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882194.90411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 
10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882194.90643: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882194.90663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882194.90728: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882194.92537: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882194.92545: stdout chunk (state=3): >>><<< 11579 1726882194.92554: stderr chunk (state=3): >>><<< 11579 1726882194.92610: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882194.92620: handler run complete 
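The cycle the log just completed (create a per-task temp dir, transfer the AnsiballZ payload, `chmod u+x`, execute with the remote Python, then `rm -f -r` the temp dir) can be sketched locally as follows. This is a simplified stand-in, not Ansible's implementation: local temp paths replace `/root/.ansible/tmp/...`, a trivial payload replaces the real AnsiballZ wrapper, and a local subprocess replaces the SSH/SFTP transport.

```python
# Local sketch of the per-task sequence traced in the log:
# mkdir tmp dir -> transfer payload -> chmod u+x -> run with python -> rm -rf.
import os
import shutil
import subprocess
import sys
import tempfile

base = os.path.join(tempfile.mkdtemp(), "ansible-tmp-sketch")

# 1. Create the per-task temp directory (the real flow runs under umask 77).
os.makedirs(base, mode=0o700)

# 2. "Transfer" the module payload (the real flow does an sftp put of
#    AnsiballZ_command.py; this stand-in just writes a trivial script).
payload = os.path.join(base, "AnsiballZ_command.py")
with open(payload, "w") as f:
    f.write('print("ok")\n')

# 3. Mark the payload executable, as in the log's 'chmod u+x' step.
os.chmod(payload, 0o700)

# 4. Execute the payload with python, capturing rc/stdout the way
#    _low_level_execute_command() reports them.
proc = subprocess.run([sys.executable, payload], capture_output=True, text=True)
print(proc.returncode, proc.stdout.strip())

# 5. Clean up, mirroring the trailing 'rm -f -r ... && sleep 0'.
shutil.rmtree(os.path.dirname(base))
```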
11579 1726882194.92999: Evaluated conditional (False): False 11579 1726882194.93169: variable 'result' from source: set_fact 11579 1726882194.93267: Evaluated conditional ('2001' in result.stdout): True 11579 1726882194.93314: attempt loop complete, returning result 11579 1726882194.93360: _execute() done 11579 1726882194.93367: dumping result to json 11579 1726882194.93608: done dumping result, returning 11579 1726882194.93611: done running TaskExecutor() for managed_node1/TASK: ** TEST check IPv6 [12673a56-9f93-f197-7423-000000000073] 11579 1726882194.93613: sending task result for task 12673a56-9f93-f197-7423-000000000073 ok: [managed_node1] => { "attempts": 1, "changed": false, "cmd": [ "ip", "-6", "a", "s", "nm-bond" ], "delta": "0:00:00.003425", "end": "2024-09-20 21:29:54.865432", "rc": 0, "start": "2024-09-20 21:29:54.862007" } STDOUT: 13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000 inet6 2001:db8::98/128 scope global dynamic noprefixroute valid_lft 236sec preferred_lft 236sec inet6 2001:db8::7483:d5ff:fecc:1d46/64 scope global dynamic noprefixroute valid_lft 1796sec preferred_lft 1796sec inet6 fe80::7483:d5ff:fecc:1d46/64 scope link noprefixroute valid_lft forever preferred_lft forever 11579 1726882194.93791: no more pending results, returning what we have 11579 1726882194.93796: results queue empty 11579 1726882194.93797: checking for any_errors_fatal 11579 1726882194.93804: done checking for any_errors_fatal 11579 1726882194.93805: checking for max_fail_percentage 11579 1726882194.93807: done checking for max_fail_percentage 11579 1726882194.93807: checking to see if all hosts have failed and the running result is not ok 11579 1726882194.93808: done checking to see if all hosts have failed 11579 1726882194.93809: getting the remaining hosts for this loop 11579 1726882194.93811: done getting the remaining hosts for this loop 11579 1726882194.93813: getting the next task for host managed_node1 11579 1726882194.93824: done getting next 
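The conditional evaluated above, `('2001' in result.stdout): True`, is a plain substring test against the command's captured stdout. A minimal sketch of that check, with the stdout text trimmed from the log's `ip -6 a s nm-bond` output (variable names are illustrative):

```python
# Sketch of the task's "'2001' in result.stdout" assertion: the interface is
# considered to have its expected global IPv6 address when an address from
# the 2001:db8:: documentation range appears in the ip(8) output.
result_stdout = (
    "13: nm-bond: mtu 1500 qdisc noqueue state UP group default qlen 1000\n"
    "    inet6 2001:db8::98/128 scope global dynamic noprefixroute\n"
    "    inet6 fe80::7483:d5ff:fecc:1d46/64 scope link noprefixroute\n"
)

has_global_v6 = "2001" in result_stdout
print(has_global_v6)
```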
task for host managed_node1 11579 1726882194.93829: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11579 1726882194.93832: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882194.93853: getting variables 11579 1726882194.93855: in VariableManager get_vars() 11579 1726882194.94081: Calling all_inventory to load vars for managed_node1 11579 1726882194.94084: Calling groups_inventory to load vars for managed_node1 11579 1726882194.94086: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882194.94100: Calling all_plugins_play to load vars for managed_node1 11579 1726882194.94104: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882194.94108: Calling groups_plugins_play to load vars for managed_node1 11579 1726882194.94699: done sending task result for task 12673a56-9f93-f197-7423-000000000073 11579 1726882194.94703: WORKER PROCESS EXITING 11579 1726882194.97074: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882194.99984: done with get_vars() 11579 1726882195.00016: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:4 Friday 20 September 2024 21:29:55 -0400 (0:00:00.526) 0:00:23.710 ****** 11579 1726882195.00172: entering _queue_task() for managed_node1/include_tasks 11579 1726882195.00562: worker is 1 (out of 1 available) 11579 1726882195.00575: exiting _queue_task() for managed_node1/include_tasks 11579 1726882195.00587: done queuing things up, now waiting for results queue to drain 11579 1726882195.00588: waiting for pending results... 
11579 1726882195.00886: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role 11579 1726882195.01066: in run() - task 12673a56-9f93-f197-7423-00000000007c 11579 1726882195.01098: variable 'ansible_search_path' from source: unknown 11579 1726882195.01108: variable 'ansible_search_path' from source: unknown 11579 1726882195.01157: calling self._execute() 11579 1726882195.01270: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.01284: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.01304: variable 'omit' from source: magic vars 11579 1726882195.01691: variable 'ansible_distribution_major_version' from source: facts 11579 1726882195.01713: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882195.01724: _execute() done 11579 1726882195.01732: dumping result to json 11579 1726882195.01739: done dumping result, returning 11579 1726882195.01799: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role [12673a56-9f93-f197-7423-00000000007c] 11579 1726882195.01802: sending task result for task 12673a56-9f93-f197-7423-00000000007c 11579 1726882195.01871: done sending task result for task 12673a56-9f93-f197-7423-00000000007c 11579 1726882195.01874: WORKER PROCESS EXITING 11579 1726882195.01943: no more pending results, returning what we have 11579 1726882195.01948: in VariableManager get_vars() 11579 1726882195.02006: Calling all_inventory to load vars for managed_node1 11579 1726882195.02010: Calling groups_inventory to load vars for managed_node1 11579 1726882195.02013: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882195.02029: Calling all_plugins_play to load vars for managed_node1 11579 1726882195.02033: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882195.02036: Calling 
groups_plugins_play to load vars for managed_node1 11579 1726882195.05781: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882195.09603: done with get_vars() 11579 1726882195.09634: variable 'ansible_search_path' from source: unknown 11579 1726882195.09635: variable 'ansible_search_path' from source: unknown 11579 1726882195.09678: we have included files to process 11579 1726882195.09679: generating all_blocks data 11579 1726882195.09681: done generating all_blocks data 11579 1726882195.09686: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11579 1726882195.09687: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11579 1726882195.09690: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml 11579 1726882195.11625: done processing included file 11579 1726882195.11629: iterating over new_blocks loaded from include file 11579 1726882195.11631: in VariableManager get_vars() 11579 1726882195.11669: done with get_vars() 11579 1726882195.11673: filtering new block on tags 11579 1726882195.11827: done filtering new block on tags 11579 1726882195.11831: in VariableManager get_vars() 11579 1726882195.11863: done with get_vars() 11579 1726882195.11865: filtering new block on tags 11579 1726882195.12057: done filtering new block on tags 11579 1726882195.12061: in VariableManager get_vars() 11579 1726882195.12084: done with get_vars() 11579 1726882195.12086: filtering new block on tags 11579 1726882195.12197: done filtering new block on tags 11579 1726882195.12200: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml for managed_node1 11579 1726882195.12205: extending task lists for 
all hosts with included blocks 11579 1726882195.14937: done extending task lists 11579 1726882195.14939: done processing included files 11579 1726882195.14940: results queue empty 11579 1726882195.14941: checking for any_errors_fatal 11579 1726882195.14945: done checking for any_errors_fatal 11579 1726882195.14946: checking for max_fail_percentage 11579 1726882195.14947: done checking for max_fail_percentage 11579 1726882195.14948: checking to see if all hosts have failed and the running result is not ok 11579 1726882195.14949: done checking to see if all hosts have failed 11579 1726882195.14949: getting the remaining hosts for this loop 11579 1726882195.14951: done getting the remaining hosts for this loop 11579 1726882195.14953: getting the next task for host managed_node1 11579 1726882195.14958: done getting next task for host managed_node1 11579 1726882195.14961: ^ task is: TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11579 1726882195.14965: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882195.14976: getting variables 11579 1726882195.14977: in VariableManager get_vars() 11579 1726882195.15011: Calling all_inventory to load vars for managed_node1 11579 1726882195.15014: Calling groups_inventory to load vars for managed_node1 11579 1726882195.15016: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882195.15023: Calling all_plugins_play to load vars for managed_node1 11579 1726882195.15025: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882195.15028: Calling groups_plugins_play to load vars for managed_node1 11579 1726882195.16698: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882195.19296: done with get_vars() 11579 1726882195.19320: done getting variables TASK [fedora.linux_system_roles.network : Ensure ansible_facts used by role are present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:3 Friday 20 September 2024 21:29:55 -0400 (0:00:00.192) 0:00:23.902 ****** 11579 1726882195.19447: entering _queue_task() for managed_node1/setup 11579 1726882195.20245: worker is 1 (out of 1 available) 11579 1726882195.20260: exiting _queue_task() for managed_node1/setup 11579 1726882195.20271: done queuing things up, now waiting for results queue to drain 11579 1726882195.20272: waiting for pending results... 
11579 1726882195.20625: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present 11579 1726882195.20938: in run() - task 12673a56-9f93-f197-7423-000000000491 11579 1726882195.20943: variable 'ansible_search_path' from source: unknown 11579 1726882195.20946: variable 'ansible_search_path' from source: unknown 11579 1726882195.20955: calling self._execute() 11579 1726882195.21062: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.21073: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.21090: variable 'omit' from source: magic vars 11579 1726882195.21507: variable 'ansible_distribution_major_version' from source: facts 11579 1726882195.21524: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882195.21828: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882195.25026: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882195.25179: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882195.25246: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882195.25379: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882195.25383: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882195.25438: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882195.25471: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' 
from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882195.25544: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882195.25592: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882195.25658: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882195.25760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882195.25785: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882195.25825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882195.25899: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882195.25902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882195.26055: variable '__network_required_facts' from source: role 
'' defaults 11579 1726882195.26069: variable 'ansible_facts' from source: unknown 11579 1726882195.26923: Evaluated conditional (__network_required_facts | difference(ansible_facts.keys() | list) | length > 0): False 11579 1726882195.26931: when evaluation is False, skipping this task 11579 1726882195.26938: _execute() done 11579 1726882195.27006: dumping result to json 11579 1726882195.27014: done dumping result, returning 11579 1726882195.27016: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure ansible_facts used by role are present [12673a56-9f93-f197-7423-000000000491] 11579 1726882195.27019: sending task result for task 12673a56-9f93-f197-7423-000000000491 skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882195.27269: no more pending results, returning what we have 11579 1726882195.27274: results queue empty 11579 1726882195.27275: checking for any_errors_fatal 11579 1726882195.27276: done checking for any_errors_fatal 11579 1726882195.27277: checking for max_fail_percentage 11579 1726882195.27279: done checking for max_fail_percentage 11579 1726882195.27280: checking to see if all hosts have failed and the running result is not ok 11579 1726882195.27281: done checking to see if all hosts have failed 11579 1726882195.27281: getting the remaining hosts for this loop 11579 1726882195.27283: done getting the remaining hosts for this loop 11579 1726882195.27286: getting the next task for host managed_node1 11579 1726882195.27302: done getting next task for host managed_node1 11579 1726882195.27339: ^ task is: TASK: fedora.linux_system_roles.network : Check if system is ostree 11579 1726882195.27345: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882195.27363: getting variables 11579 1726882195.27365: in VariableManager get_vars() 11579 1726882195.27656: Calling all_inventory to load vars for managed_node1 11579 1726882195.27659: Calling groups_inventory to load vars for managed_node1 11579 1726882195.27662: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882195.27673: Calling all_plugins_play to load vars for managed_node1 11579 1726882195.27675: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882195.27677: Calling groups_plugins_play to load vars for managed_node1 11579 1726882195.28322: done sending task result for task 12673a56-9f93-f197-7423-000000000491 11579 1726882195.28326: WORKER PROCESS EXITING 11579 1726882195.30100: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882195.33854: done with get_vars() 11579 1726882195.33885: done getting variables TASK [fedora.linux_system_roles.network : Check if system is ostree] *********** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:12 Friday 20 September 2024 21:29:55 -0400 (0:00:00.146) 0:00:24.049 ****** 11579 1726882195.34141: entering _queue_task() for managed_node1/stat 11579 1726882195.35001: worker is 1 (out of 1 available) 11579 1726882195.35017: exiting _queue_task() for managed_node1/stat 11579 1726882195.35028: done queuing things up, now waiting for results queue to drain 11579 1726882195.35029: waiting for pending results... 11579 1726882195.35416: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree 11579 1726882195.35887: in run() - task 12673a56-9f93-f197-7423-000000000493 11579 1726882195.35996: variable 'ansible_search_path' from source: unknown 11579 1726882195.36000: variable 'ansible_search_path' from source: unknown 11579 1726882195.36003: calling self._execute() 11579 1726882195.36064: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.36217: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.36234: variable 'omit' from source: magic vars 11579 1726882195.36914: variable 'ansible_distribution_major_version' from source: facts 11579 1726882195.36934: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882195.37288: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882195.38041: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882195.38273: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882195.38277: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882195.38280: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 
11579 1726882195.38491: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882195.38623: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882195.38650: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882195.38677: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882195.38784: variable '__network_is_ostree' from source: set_fact 11579 1726882195.38933: Evaluated conditional (not __network_is_ostree is defined): False 11579 1726882195.39032: when evaluation is False, skipping this task 11579 1726882195.39036: _execute() done 11579 1726882195.39039: dumping result to json 11579 1726882195.39041: done dumping result, returning 11579 1726882195.39044: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if system is ostree [12673a56-9f93-f197-7423-000000000493] 11579 1726882195.39046: sending task result for task 12673a56-9f93-f197-7423-000000000493 11579 1726882195.39116: done sending task result for task 12673a56-9f93-f197-7423-000000000493 11579 1726882195.39119: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11579 1726882195.39177: no more pending results, returning what we have 11579 1726882195.39181: results queue empty 11579 1726882195.39182: checking for any_errors_fatal 11579 1726882195.39190: 
done checking for any_errors_fatal 11579 1726882195.39191: checking for max_fail_percentage 11579 1726882195.39197: done checking for max_fail_percentage 11579 1726882195.39198: checking to see if all hosts have failed and the running result is not ok 11579 1726882195.39199: done checking to see if all hosts have failed 11579 1726882195.39200: getting the remaining hosts for this loop 11579 1726882195.39202: done getting the remaining hosts for this loop 11579 1726882195.39205: getting the next task for host managed_node1 11579 1726882195.39213: done getting next task for host managed_node1 11579 1726882195.39217: ^ task is: TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11579 1726882195.39223: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882195.39241: getting variables 11579 1726882195.39242: in VariableManager get_vars() 11579 1726882195.39283: Calling all_inventory to load vars for managed_node1 11579 1726882195.39285: Calling groups_inventory to load vars for managed_node1 11579 1726882195.39287: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882195.39304: Calling all_plugins_play to load vars for managed_node1 11579 1726882195.39307: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882195.39310: Calling groups_plugins_play to load vars for managed_node1 11579 1726882195.41819: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882195.43615: done with get_vars() 11579 1726882195.43650: done getting variables 11579 1726882195.43737: Loading ActionModule 'set_fact' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/set_fact.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Set flag to indicate system is ostree] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:17 Friday 20 September 2024 21:29:55 -0400 (0:00:00.096) 0:00:24.146 ****** 11579 1726882195.43802: entering _queue_task() for managed_node1/set_fact 11579 1726882195.44224: worker is 1 (out of 1 available) 11579 1726882195.44237: exiting _queue_task() for managed_node1/set_fact 11579 1726882195.44361: done queuing things up, now waiting for results queue to drain 11579 1726882195.44363: waiting for pending results... 
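The "Ensure ansible_facts used by role are present" task above skipped because its `when` conditional, `__network_required_facts | difference(ansible_facts.keys() | list) | length > 0`, evaluated to False: every fact the role needs was already gathered. A minimal Python sketch of that set-difference gate, with hypothetical fact names (the real `__network_required_facts` list comes from the role defaults and is not shown in this log):

```python
# Sketch of the fact-gating conditional recorded in the log:
#   __network_required_facts | difference(ansible_facts.keys() | list) | length > 0
# The fact names below are hypothetical placeholders.
required_facts = ["distribution", "distribution_major_version", "os_family"]
ansible_facts = {
    "distribution": "CentOS",
    "distribution_major_version": "9",
    "os_family": "RedHat",
}

# Jinja's `difference` filter keeps items of the left list that are
# absent from the right; an empty result means nothing is missing.
missing = [f for f in required_facts if f not in ansible_facts]
needs_setup = len(missing) > 0

print(needs_setup)  # False here, so the setup task is skipped
```

When any required fact is missing the conditional flips to True and the role runs a `setup` pass restricted to just those facts.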
11579 1726882195.44601: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree 11579 1726882195.44784: in run() - task 12673a56-9f93-f197-7423-000000000494 11579 1726882195.44820: variable 'ansible_search_path' from source: unknown 11579 1726882195.44830: variable 'ansible_search_path' from source: unknown 11579 1726882195.44872: calling self._execute() 11579 1726882195.44991: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.45022: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.45053: variable 'omit' from source: magic vars 11579 1726882195.45599: variable 'ansible_distribution_major_version' from source: facts 11579 1726882195.45603: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882195.45792: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882195.46083: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882195.46140: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882195.46178: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882195.46224: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882195.46317: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882195.46429: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882195.46433: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882195.46436: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882195.46504: variable '__network_is_ostree' from source: set_fact 11579 1726882195.46517: Evaluated conditional (not __network_is_ostree is defined): False 11579 1726882195.46525: when evaluation is False, skipping this task 11579 1726882195.46540: _execute() done 11579 1726882195.46551: dumping result to json 11579 1726882195.46559: done dumping result, returning 11579 1726882195.46571: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Set flag to indicate system is ostree [12673a56-9f93-f197-7423-000000000494] 11579 1726882195.46581: sending task result for task 12673a56-9f93-f197-7423-000000000494 11579 1726882195.46724: done sending task result for task 12673a56-9f93-f197-7423-000000000494 11579 1726882195.46727: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not __network_is_ostree is defined", "skip_reason": "Conditional result was False" } 11579 1726882195.46807: no more pending results, returning what we have 11579 1726882195.46810: results queue empty 11579 1726882195.46811: checking for any_errors_fatal 11579 1726882195.46817: done checking for any_errors_fatal 11579 1726882195.46818: checking for max_fail_percentage 11579 1726882195.46819: done checking for max_fail_percentage 11579 1726882195.46821: checking to see if all hosts have failed and the running result is not ok 11579 1726882195.46822: done checking to see if all hosts have failed 11579 1726882195.46823: getting the remaining hosts for this loop 11579 1726882195.46824: done getting the remaining hosts for this loop 
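Both ostree tasks ("Check if system is ostree" and "Set flag to indicate system is ostree") skip with `"false_condition": "not __network_is_ostree is defined"`: the log shows `__network_is_ostree` coming `from source: set_fact`, meaning an earlier run of this role already established it, so the guard is False regardless of the fact's value. A sketch of that guard pattern, with a hypothetical value:

```python
# `__network_is_ostree` was set earlier via set_fact, so the Jinja guard
#   when: not __network_is_ostree is defined
# is False and both guarded tasks skip. Only presence matters, not value.
host_facts = {"__network_is_ostree": False}  # hypothetical value

def should_run(guarded_var: str, facts: dict) -> bool:
    # Mirrors Jinja's `not <var> is defined`
    return guarded_var not in facts

print(should_run("__network_is_ostree", host_facts))  # False -> skip
```

This is the usual idempotence idiom for expensive one-time probes: compute once, cache the result in a fact, and guard later invocations on the fact being defined.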
11579 1726882195.46827: getting the next task for host managed_node1 11579 1726882195.46839: done getting next task for host managed_node1 11579 1726882195.46843: ^ task is: TASK: fedora.linux_system_roles.network : Check which services are running 11579 1726882195.46848: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882195.46871: getting variables 11579 1726882195.46873: in VariableManager get_vars() 11579 1726882195.46910: Calling all_inventory to load vars for managed_node1 11579 1726882195.46913: Calling groups_inventory to load vars for managed_node1 11579 1726882195.46915: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882195.46924: Calling all_plugins_play to load vars for managed_node1 11579 1726882195.46926: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882195.46928: Calling groups_plugins_play to load vars for managed_node1 11579 1726882195.48196: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882195.50290: done with get_vars() 11579 1726882195.50316: done getting variables TASK [fedora.linux_system_roles.network : Check which services are running] **** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21 Friday 20 September 2024 21:29:55 -0400 (0:00:00.066) 0:00:24.212 ****** 11579 1726882195.50420: entering _queue_task() for managed_node1/service_facts 11579 1726882195.50748: worker is 1 (out of 1 available) 11579 1726882195.50761: exiting _queue_task() for managed_node1/service_facts 11579 1726882195.50776: done queuing things up, now waiting for results queue to drain 11579 1726882195.50778: waiting for pending results... 
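In the service_facts execution that follows, the ssh connection plugin first resolves the remote home directory (`echo ~ && sleep 0`), then creates a per-invocation temp directory under `~/.ansible/tmp` before transferring `AnsiballZ_service_facts.py` into it. The directory name in the log, `ansible-tmp-1726882195.5591476-12724-100009215206979`, appears consistent with an epoch-pid-random naming pattern; a hypothetical reconstruction:

```python
import os
import random
import re
import time

# Hypothetical reconstruction of the remote temp-dir naming seen in the log:
#   ansible-tmp-<time.time()>-<pid>-<random integer>
# e.g. ansible-tmp-1726882195.5591476-12724-100009215206979
def remote_tmp_name() -> str:
    return "ansible-tmp-%s-%s-%s" % (time.time(), os.getpid(), random.randint(0, 2 ** 48))

name = remote_tmp_name()
assert re.match(r"ansible-tmp-\d+\.\d+-\d+-\d+$", name)
```

The uniqueness of the name lets concurrent task invocations on the same host share `~/.ansible/tmp` without colliding, and the `umask 77` in the remote `mkdir` command keeps the directory private to the connecting user.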
11579 1726882195.51248: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running 11579 1726882195.51254: in run() - task 12673a56-9f93-f197-7423-000000000496 11579 1726882195.51258: variable 'ansible_search_path' from source: unknown 11579 1726882195.51260: variable 'ansible_search_path' from source: unknown 11579 1726882195.51288: calling self._execute() 11579 1726882195.51382: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.51386: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.51404: variable 'omit' from source: magic vars 11579 1726882195.52154: variable 'ansible_distribution_major_version' from source: facts 11579 1726882195.52167: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882195.52173: variable 'omit' from source: magic vars 11579 1726882195.52260: variable 'omit' from source: magic vars 11579 1726882195.52296: variable 'omit' from source: magic vars 11579 1726882195.52342: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882195.52502: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882195.52506: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882195.52508: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882195.52511: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882195.52513: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882195.52516: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.52518: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11579 1726882195.52578: Set connection var ansible_timeout to 10 11579 1726882195.52652: Set connection var ansible_shell_type to sh 11579 1726882195.52656: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882195.52662: Set connection var ansible_shell_executable to /bin/sh 11579 1726882195.52670: Set connection var ansible_pipelining to False 11579 1726882195.52672: Set connection var ansible_connection to ssh 11579 1726882195.52698: variable 'ansible_shell_executable' from source: unknown 11579 1726882195.52701: variable 'ansible_connection' from source: unknown 11579 1726882195.52774: variable 'ansible_module_compression' from source: unknown 11579 1726882195.52777: variable 'ansible_shell_type' from source: unknown 11579 1726882195.52780: variable 'ansible_shell_executable' from source: unknown 11579 1726882195.52782: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882195.52784: variable 'ansible_pipelining' from source: unknown 11579 1726882195.52786: variable 'ansible_timeout' from source: unknown 11579 1726882195.52788: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882195.53085: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882195.53090: variable 'omit' from source: magic vars 11579 1726882195.53092: starting attempt loop 11579 1726882195.53097: running the handler 11579 1726882195.53099: _low_level_execute_command(): starting 11579 1726882195.53101: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882195.53726: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882195.53739: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config 
<<< 11579 1726882195.53750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882195.53837: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882195.53842: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882195.53845: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882195.53848: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.53854: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.54001: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882195.54004: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882195.54024: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882195.55665: stdout chunk (state=3): >>>/root <<< 11579 1726882195.55843: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882195.55848: stdout chunk (state=3): >>><<< 11579 1726882195.55859: stderr chunk (state=3): >>><<< 11579 1726882195.55918: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading 
configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882195.55998: _low_level_execute_command(): starting 11579 1726882195.56003: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979 `" && echo ansible-tmp-1726882195.5591476-12724-100009215206979="` echo /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979 `" ) && sleep 0' 11579 1726882195.57213: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.57399: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882195.57402: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882195.57405: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882195.57445: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882195.59311: stdout chunk (state=3): >>>ansible-tmp-1726882195.5591476-12724-100009215206979=/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979 <<< 11579 1726882195.59598: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882195.59602: stdout chunk (state=3): >>><<< 11579 1726882195.59605: stderr chunk (state=3): >>><<< 11579 1726882195.59608: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882195.5591476-12724-100009215206979=/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882195.59610: variable 'ansible_module_compression' from source: unknown 11579 1726882195.59612: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.service_facts-ZIP_DEFLATED 11579 1726882195.59614: variable 'ansible_facts' from source: unknown 11579 1726882195.59702: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py 11579 1726882195.60072: Sending initial data 11579 1726882195.60079: Sent initial data (162 bytes) 11579 1726882195.60703: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882195.60706: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882195.60709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882195.60713: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882195.60715: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882195.60718: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match 
pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882195.60916: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882195.60987: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882195.62479: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882195.62490: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882195.62536: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882195.62578: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmph8pcmfsy /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py <<< 11579 1726882195.62582: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py" <<< 11579 1726882195.62619: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmph8pcmfsy" to remote "/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py" <<< 11579 1726882195.63150: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882195.63201: stderr chunk (state=3): >>><<< 11579 1726882195.63205: stdout chunk (state=3): >>><<< 11579 1726882195.63207: done transferring module to remote 11579 1726882195.63209: _low_level_execute_command(): starting 11579 1726882195.63211: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/ /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py && sleep 0' 11579 1726882195.63647: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882195.63707: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.63719: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882195.63754: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882195.63823: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882195.65558: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882195.65563: stdout chunk (state=3): >>><<< 11579 1726882195.65566: stderr chunk (state=3): >>><<< 11579 1726882195.65579: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration 
data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882195.65582: _low_level_execute_command(): starting 11579 1726882195.65586: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/AnsiballZ_service_facts.py && sleep 0' 11579 1726882195.65963: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882195.65992: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882195.65999: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882195.66001: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.66005: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882195.66008: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882195.66055: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 
1726882195.66059: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882195.66109: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.16102: stdout chunk (state=3): >>> {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": "enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": 
"dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, "gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": 
"initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "st<<< 11579 1726882197.16110: stdout chunk (state=3): >>>opped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": "modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, 
"NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": 
"systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": "sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", 
"state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": 
"systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": "systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, 
"systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": "systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "st<<< 11579 1726882197.16121: stdout chunk (state=3): >>>opped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, "wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": 
"disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, 
"dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": {"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": "selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, 
"sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "st<<< 11579 1726882197.16125: stdout chunk (state=3): >>>atic", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": {"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", 
"state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": 
"systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": 
"systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} <<< 11579 1726882197.17715: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882197.17720: stdout chunk (state=3): >>><<< 11579 1726882197.17723: stderr chunk (state=3): >>><<< 11579 1726882197.17902: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"services": {"audit-rules.service": {"name": "audit-rules.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "auditd.service": {"name": "auditd.service", "state": "running", "status": "enabled", "source": "systemd"}, "auth-rpcgss-module.service": {"name": "auth-rpcgss-module.service", "state": "stopped", "status": "static", "source": "systemd"}, "autofs.service": {"name": "autofs.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "chronyd.service": {"name": "chronyd.service", "state": "running", "status": "enabled", "source": "systemd"}, "cloud-config.service": {"name": "cloud-config.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-final.service": {"name": "cloud-final.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init-local.service": {"name": "cloud-init-local.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "cloud-init.service": {"name": "cloud-init.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "crond.service": {"name": "crond.service", "state": "running", "status": 
"enabled", "source": "systemd"}, "dbus-broker.service": {"name": "dbus-broker.service", "state": "running", "status": "enabled", "source": "systemd"}, "display-manager.service": {"name": "display-manager.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "dm-event.service": {"name": "dm-event.service", "state": "stopped", "status": "static", "source": "systemd"}, "dnf-makecache.service": {"name": "dnf-makecache.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-cmdline.service": {"name": "dracut-cmdline.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-initqueue.service": {"name": "dracut-initqueue.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-mount.service": {"name": "dracut-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-mount.service": {"name": "dracut-pre-mount.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-pivot.service": {"name": "dracut-pre-pivot.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-trigger.service": {"name": "dracut-pre-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-pre-udev.service": {"name": "dracut-pre-udev.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown-onfailure.service": {"name": "dracut-shutdown-onfailure.service", "state": "stopped", "status": "static", "source": "systemd"}, "dracut-shutdown.service": {"name": "dracut-shutdown.service", "state": "stopped", "status": "static", "source": "systemd"}, "emergency.service": {"name": "emergency.service", "state": "stopped", "status": "static", "source": "systemd"}, "fstrim.service": {"name": "fstrim.service", "state": "stopped", "status": "static", "source": "systemd"}, "getty@tty1.service": {"name": "getty@tty1.service", "state": "running", "status": "active", "source": "systemd"}, 
"gssproxy.service": {"name": "gssproxy.service", "state": "running", "status": "disabled", "source": "systemd"}, "hv_kvp_daemon.service": {"name": "hv_kvp_daemon.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "initrd-cleanup.service": {"name": "initrd-cleanup.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-parse-etc.service": {"name": "initrd-parse-etc.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-switch-root.service": {"name": "initrd-switch-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "initrd-udevadm-cleanup-db.service": {"name": "initrd-udevadm-cleanup-db.service", "state": "stopped", "status": "static", "source": "systemd"}, "irqbalance.service": {"name": "irqbalance.service", "state": "running", "status": "enabled", "source": "systemd"}, "kdump.service": {"name": "kdump.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "kmod-static-nodes.service": {"name": "kmod-static-nodes.service", "state": "stopped", "status": "static", "source": "systemd"}, "ldconfig.service": {"name": "ldconfig.service", "state": "stopped", "status": "static", "source": "systemd"}, "logrotate.service": {"name": "logrotate.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-lvmpolld.service": {"name": "lvm2-lvmpolld.service", "state": "stopped", "status": "static", "source": "systemd"}, "lvm2-monitor.service": {"name": "lvm2-monitor.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "modprobe@configfs.service": {"name": "modprobe@configfs.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@dm_mod.service": {"name": "modprobe@dm_mod.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@drm.service": {"name": "modprobe@drm.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@efi_pstore.service": {"name": 
"modprobe@efi_pstore.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@fuse.service": {"name": "modprobe@fuse.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "modprobe@loop.service": {"name": "modprobe@loop.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "network.service": {"name": "network.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "NetworkManager-dispatcher.service": {"name": "NetworkManager-dispatcher.service", "state": "running", "status": "enabled", "source": "systemd"}, "NetworkManager-wait-online.service": {"name": "NetworkManager-wait-online.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "NetworkManager.service": {"name": "NetworkManager.service", "state": "running", "status": "enabled", "source": "systemd"}, "nfs-idmapd.service": {"name": "nfs-idmapd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-mountd.service": {"name": "nfs-mountd.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfs-server.service": {"name": "nfs-server.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "nfs-utils.service": {"name": "nfs-utils.service", "state": "stopped", "status": "static", "source": "systemd"}, "nfsdcld.service": {"name": "nfsdcld.service", "state": "stopped", "status": "static", "source": "systemd"}, "ntpd.service": {"name": "ntpd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ntpdate.service": {"name": "ntpdate.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "pcscd.service": {"name": "pcscd.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "plymouth-quit-wait.service": {"name": "plymouth-quit-wait.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "plymouth-start.service": {"name": "plymouth-start.service", "state": "stopped", "status": 
"not-found", "source": "systemd"}, "rc-local.service": {"name": "rc-local.service", "state": "stopped", "status": "static", "source": "systemd"}, "rescue.service": {"name": "rescue.service", "state": "stopped", "status": "static", "source": "systemd"}, "restraintd.service": {"name": "restraintd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rngd.service": {"name": "rngd.service", "state": "running", "status": "enabled", "source": "systemd"}, "rpc-gssd.service": {"name": "rpc-gssd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd-notify.service": {"name": "rpc-statd-notify.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-statd.service": {"name": "rpc-statd.service", "state": "stopped", "status": "static", "source": "systemd"}, "rpc-svcgssd.service": {"name": "rpc-svcgssd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "rpcbind.service": {"name": "rpcbind.service", "state": "running", "status": "enabled", "source": "systemd"}, "rsyslog.service": {"name": "rsyslog.service", "state": "running", "status": "enabled", "source": "systemd"}, "selinux-autorelabel-mark.service": {"name": "selinux-autorelabel-mark.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "serial-getty@ttyS0.service": {"name": "serial-getty@ttyS0.service", "state": "running", "status": "active", "source": "systemd"}, "sntp.service": {"name": "sntp.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "ssh-host-keys-migration.service": {"name": "ssh-host-keys-migration.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "sshd-keygen.service": {"name": "sshd-keygen.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "sshd-keygen@ecdsa.service": {"name": "sshd-keygen@ecdsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@ed25519.service": {"name": 
"sshd-keygen@ed25519.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd-keygen@rsa.service": {"name": "sshd-keygen@rsa.service", "state": "stopped", "status": "inactive", "source": "systemd"}, "sshd.service": {"name": "sshd.service", "state": "running", "status": "enabled", "source": "systemd"}, "sssd-kcm.service": {"name": "sssd-kcm.service", "state": "stopped", "status": "indirect", "source": "systemd"}, "sssd.service": {"name": "sssd.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "syslog.service": {"name": "syslog.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-ask-password-console.service": {"name": "systemd-ask-password-console.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-ask-password-wall.service": {"name": "systemd-ask-password-wall.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-battery-check.service": {"name": "systemd-battery-check.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-binfmt.service": {"name": "systemd-binfmt.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-boot-random-seed.service": {"name": "systemd-boot-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-confext.service": {"name": "systemd-confext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-firstboot.service": {"name": "systemd-firstboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-fsck-root.service": {"name": "systemd-fsck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-clear.service": {"name": "systemd-hibernate-clear.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hibernate-resume.service": {"name": "systemd-hibernate-resume.service", "state": "stopped", "status": "static", 
"source": "systemd"}, "systemd-hostnamed.service": {"name": "systemd-hostnamed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-hwdb-update.service": {"name": "systemd-hwdb-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-initctl.service": {"name": "systemd-initctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-catalog-update.service": {"name": "systemd-journal-catalog-update.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journal-flush.service": {"name": "systemd-journal-flush.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-journald.service": {"name": "systemd-journald.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-logind.service": {"name": "systemd-logind.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-machine-id-commit.service": {"name": "systemd-machine-id-commit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-modules-load.service": {"name": "systemd-modules-load.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-network-generator.service": {"name": "systemd-network-generator.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-networkd-wait-online.service": {"name": "systemd-networkd-wait-online.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-oomd.service": {"name": "systemd-oomd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-pcrmachine.service": {"name": "systemd-pcrmachine.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-initrd.service": {"name": "systemd-pcrphase-initrd.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase-sysinit.service": {"name": 
"systemd-pcrphase-sysinit.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pcrphase.service": {"name": "systemd-pcrphase.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-pstore.service": {"name": "systemd-pstore.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-quotacheck-root.service": {"name": "systemd-quotacheck-root.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-random-seed.service": {"name": "systemd-random-seed.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-remount-fs.service": {"name": "systemd-remount-fs.service", "state": "stopped", "status": "enabled-runtime", "source": "systemd"}, "systemd-repart.service": {"name": "systemd-repart.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-rfkill.service": {"name": "systemd-rfkill.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-soft-reboot.service": {"name": "systemd-soft-reboot.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysctl.service": {"name": "systemd-sysctl.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-sysext.service": {"name": "systemd-sysext.service", "state": "stopped", "status": "enabled", "source": "systemd"}, "systemd-sysusers.service": {"name": "systemd-sysusers.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-timesyncd.service": {"name": "systemd-timesyncd.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "systemd-tmpfiles-clean.service": {"name": "systemd-tmpfiles-clean.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev-early.service": {"name": "systemd-tmpfiles-setup-dev-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup-dev.service": {"name": 
"systemd-tmpfiles-setup-dev.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tmpfiles-setup.service": {"name": "systemd-tmpfiles-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup-early.service": {"name": "systemd-tpm2-setup-early.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-tpm2-setup.service": {"name": "systemd-tpm2-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-load-credentials.service": {"name": "systemd-udev-load-credentials.service", "state": "stopped", "status": "disabled", "source": "systemd"}, "systemd-udev-settle.service": {"name": "systemd-udev-settle.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udev-trigger.service": {"name": "systemd-udev-trigger.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-udevd.service": {"name": "systemd-udevd.service", "state": "running", "status": "static", "source": "systemd"}, "systemd-update-done.service": {"name": "systemd-update-done.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp-runlevel.service": {"name": "systemd-update-utmp-runlevel.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-update-utmp.service": {"name": "systemd-update-utmp.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-user-sessions.service": {"name": "systemd-user-sessions.service", "state": "stopped", "status": "static", "source": "systemd"}, "systemd-vconsole-setup.service": {"name": "systemd-vconsole-setup.service", "state": "stopped", "status": "static", "source": "systemd"}, "user-runtime-dir@0.service": {"name": "user-runtime-dir@0.service", "state": "stopped", "status": "active", "source": "systemd"}, "user@0.service": {"name": "user@0.service", "state": "running", "status": "active", "source": "systemd"}, 
"wpa_supplicant.service": {"name": "wpa_supplicant.service", "state": "running", "status": "enabled", "source": "systemd"}, "ypbind.service": {"name": "ypbind.service", "state": "stopped", "status": "not-found", "source": "systemd"}, "autovt@.service": {"name": "autovt@.service", "state": "unknown", "status": "alias", "source": "systemd"}, "blk-availability.service": {"name": "blk-availability.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "capsule@.service": {"name": "capsule@.service", "state": "unknown", "status": "static", "source": "systemd"}, "chrony-wait.service": {"name": "chrony-wait.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "chronyd-restricted.service": {"name": "chronyd-restricted.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "cloud-init-hotplugd.service": {"name": "cloud-init-hotplugd.service", "state": "inactive", "status": "static", "source": "systemd"}, "console-getty.service": {"name": "console-getty.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "container-getty@.service": {"name": "container-getty@.service", "state": "unknown", "status": "static", "source": "systemd"}, "dbus-org.freedesktop.hostname1.service": {"name": "dbus-org.freedesktop.hostname1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.locale1.service": {"name": "dbus-org.freedesktop.locale1.service", "state": "inactive", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.login1.service": {"name": "dbus-org.freedesktop.login1.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.nm-dispatcher.service": {"name": "dbus-org.freedesktop.nm-dispatcher.service", "state": "active", "status": "alias", "source": "systemd"}, "dbus-org.freedesktop.timedate1.service": {"name": "dbus-org.freedesktop.timedate1.service", "state": "inactive", "status": "alias", "source": "systemd"}, 
"dbus.service": {"name": "dbus.service", "state": "active", "status": "alias", "source": "systemd"}, "debug-shell.service": {"name": "debug-shell.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd.service": {"name": "dhcpcd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dhcpcd@.service": {"name": "dhcpcd@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "dnf-system-upgrade-cleanup.service": {"name": "dnf-system-upgrade-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "dnf-system-upgrade.service": {"name": "dnf-system-upgrade.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "dnsmasq.service": {"name": "dnsmasq.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fips-crypto-policy-overlay.service": {"name": "fips-crypto-policy-overlay.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "firewalld.service": {"name": "firewalld.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "fsidd.service": {"name": "fsidd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "getty@.service": {"name": "getty@.service", "state": "unknown", "status": "enabled", "source": "systemd"}, "grub-boot-indeterminate.service": {"name": "grub-boot-indeterminate.service", "state": "inactive", "status": "static", "source": "systemd"}, "grub2-systemd-integration.service": {"name": "grub2-systemd-integration.service", "state": "inactive", "status": "static", "source": "systemd"}, "hostapd.service": {"name": "hostapd.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "kvm_stat.service": {"name": "kvm_stat.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "lvm-devices-import.service": {"name": "lvm-devices-import.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "man-db-cache-update.service": 
{"name": "man-db-cache-update.service", "state": "inactive", "status": "static", "source": "systemd"}, "man-db-restart-cache-update.service": {"name": "man-db-restart-cache-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "microcode.service": {"name": "microcode.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "modprobe@.service": {"name": "modprobe@.service", "state": "unknown", "status": "static", "source": "systemd"}, "nfs-blkmap.service": {"name": "nfs-blkmap.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nftables.service": {"name": "nftables.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nis-domainname.service": {"name": "nis-domainname.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "nm-priv-helper.service": {"name": "nm-priv-helper.service", "state": "inactive", "status": "static", "source": "systemd"}, "pam_namespace.service": {"name": "pam_namespace.service", "state": "inactive", "status": "static", "source": "systemd"}, "polkit.service": {"name": "polkit.service", "state": "inactive", "status": "static", "source": "systemd"}, "qemu-guest-agent.service": {"name": "qemu-guest-agent.service", "state": "inactive", "status": "enabled", "source": "systemd"}, "quotaon-root.service": {"name": "quotaon-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "quotaon@.service": {"name": "quotaon@.service", "state": "unknown", "status": "static", "source": "systemd"}, "rpmdb-migrate.service": {"name": "rpmdb-migrate.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "rpmdb-rebuild.service": {"name": "rpmdb-rebuild.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "selinux-autorelabel.service": {"name": "selinux-autorelabel.service", "state": "inactive", "status": "static", "source": "systemd"}, "selinux-check-proper-disable.service": {"name": 
"selinux-check-proper-disable.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "serial-getty@.service": {"name": "serial-getty@.service", "state": "unknown", "status": "indirect", "source": "systemd"}, "sshd-keygen@.service": {"name": "sshd-keygen@.service", "state": "unknown", "status": "disabled", "source": "systemd"}, "sshd@.service": {"name": "sshd@.service", "state": "unknown", "status": "static", "source": "systemd"}, "sssd-autofs.service": {"name": "sssd-autofs.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-nss.service": {"name": "sssd-nss.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pac.service": {"name": "sssd-pac.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-pam.service": {"name": "sssd-pam.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-ssh.service": {"name": "sssd-ssh.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "sssd-sudo.service": {"name": "sssd-sudo.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "system-update-cleanup.service": {"name": "system-update-cleanup.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-backlight@.service": {"name": "systemd-backlight@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-bless-boot.service": {"name": "systemd-bless-boot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-boot-check-no-failures.service": {"name": "systemd-boot-check-no-failures.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-boot-update.service": {"name": "systemd-boot-update.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-bootctl@.service": {"name": "systemd-bootctl@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-coredump@.service": 
{"name": "systemd-coredump@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-creds@.service": {"name": "systemd-creds@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-exit.service": {"name": "systemd-exit.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-fsck@.service": {"name": "systemd-fsck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-growfs-root.service": {"name": "systemd-growfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-growfs@.service": {"name": "systemd-growfs@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-halt.service": {"name": "systemd-halt.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hibernate.service": {"name": "systemd-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-hybrid-sleep.service": {"name": "systemd-hybrid-sleep.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-journald-sync@.service": {"name": "systemd-journald-sync@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-journald@.service": {"name": "systemd-journald@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-kexec.service": {"name": "systemd-kexec.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-localed.service": {"name": "systemd-localed.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrextend@.service": {"name": "systemd-pcrextend@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-pcrfs-root.service": {"name": "systemd-pcrfs-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-pcrfs@.service": {"name": "systemd-pcrfs@.service", "state": "unknown", "status": "static", "source": 
"systemd"}, "systemd-pcrlock-file-system.service": {"name": "systemd-pcrlock-file-system.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-code.service": {"name": "systemd-pcrlock-firmware-code.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-firmware-config.service": {"name": "systemd-pcrlock-firmware-config.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-machine-id.service": {"name": "systemd-pcrlock-machine-id.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-make-policy.service": {"name": "systemd-pcrlock-make-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-authority.service": {"name": "systemd-pcrlock-secureboot-authority.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock-secureboot-policy.service": {"name": "systemd-pcrlock-secureboot-policy.service", "state": "inactive", "status": "disabled", "source": "systemd"}, "systemd-pcrlock@.service": {"name": "systemd-pcrlock@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-poweroff.service": {"name": "systemd-poweroff.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-quotacheck@.service": {"name": "systemd-quotacheck@.service", "state": "unknown", "status": "static", "source": "systemd"}, "systemd-reboot.service": {"name": "systemd-reboot.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend-then-hibernate.service": {"name": "systemd-suspend-then-hibernate.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-suspend.service": {"name": "systemd-suspend.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-sysext@.service": {"name": "systemd-sysext@.service", "state": "unknown", 
"status": "static", "source": "systemd"}, "systemd-sysupdate-reboot.service": {"name": "systemd-sysupdate-reboot.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-sysupdate.service": {"name": "systemd-sysupdate.service", "state": "inactive", "status": "indirect", "source": "systemd"}, "systemd-timedated.service": {"name": "systemd-timedated.service", "state": "inactive", "status": "static", "source": "systemd"}, "systemd-volatile-root.service": {"name": "systemd-volatile-root.service", "state": "inactive", "status": "static", "source": "systemd"}, "user-runtime-dir@.service": {"name": "user-runtime-dir@.service", "state": "unknown", "status": "static", "source": "systemd"}, "user@.service": {"name": "user@.service", "state": "unknown", "status": "static", "source": "systemd"}}}, "invocation": {"module_args": {}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: 
Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882197.18939: done with _execute_module (service_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'service_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882197.18957: _low_level_execute_command(): starting 11579 1726882197.18967: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882195.5591476-12724-100009215206979/ > /dev/null 2>&1 && sleep 0' 11579 1726882197.19634: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882197.19650: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882197.19665: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882197.19685: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882197.19709: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882197.19722: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882197.19736: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.19813: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.19848: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.19867: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882197.19884: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.19957: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.21859: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.21863: stdout chunk (state=3): >>><<< 11579 1726882197.21866: stderr chunk (state=3): >>><<< 11579 1726882197.21906: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882197.21910: handler run complete 11579 1726882197.22298: variable 'ansible_facts' from source: unknown 11579 1726882197.22818: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882197.23925: variable 'ansible_facts' from source: unknown 11579 1726882197.24226: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882197.24647: attempt loop complete, returning result 11579 1726882197.24651: _execute() done 11579 1726882197.24653: dumping result to json 11579 1726882197.25035: done dumping result, returning 11579 1726882197.25045: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which services are running [12673a56-9f93-f197-7423-000000000496] 11579 1726882197.25050: sending task result for task 12673a56-9f93-f197-7423-000000000496 11579 1726882197.27130: done sending task result for task 12673a56-9f93-f197-7423-000000000496 11579 1726882197.27133: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882197.27242: no more pending results, returning what we have 11579 1726882197.27245: results queue empty 11579 1726882197.27245: checking for any_errors_fatal 11579 1726882197.27254: done checking for any_errors_fatal 11579 1726882197.27255: checking for max_fail_percentage 11579 1726882197.27257: done checking for max_fail_percentage 11579 1726882197.27258: checking to see if all hosts have failed and the running result is not ok 11579 
1726882197.27258: done checking to see if all hosts have failed 11579 1726882197.27259: getting the remaining hosts for this loop 11579 1726882197.27261: done getting the remaining hosts for this loop 11579 1726882197.27264: getting the next task for host managed_node1 11579 1726882197.27269: done getting next task for host managed_node1 11579 1726882197.27273: ^ task is: TASK: fedora.linux_system_roles.network : Check which packages are installed 11579 1726882197.27278: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882197.27290: getting variables 11579 1726882197.27292: in VariableManager get_vars() 11579 1726882197.27326: Calling all_inventory to load vars for managed_node1 11579 1726882197.27328: Calling groups_inventory to load vars for managed_node1 11579 1726882197.27330: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882197.27338: Calling all_plugins_play to load vars for managed_node1 11579 1726882197.27341: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882197.27343: Calling groups_plugins_play to load vars for managed_node1 11579 1726882197.29621: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882197.31345: done with get_vars() 11579 1726882197.31370: done getting variables TASK [fedora.linux_system_roles.network : Check which packages are installed] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26 Friday 20 September 2024 21:29:57 -0400 (0:00:01.810) 0:00:26.023 ****** 11579 1726882197.31482: entering _queue_task() for managed_node1/package_facts 11579 1726882197.31989: worker is 1 (out of 1 available) 11579 1726882197.32206: exiting _queue_task() for managed_node1/package_facts 11579 1726882197.32219: done queuing things up, now waiting for results queue to drain 11579 1726882197.32221: waiting for pending results... 
11579 1726882197.32772: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed 11579 1726882197.33032: in run() - task 12673a56-9f93-f197-7423-000000000497 11579 1726882197.33110: variable 'ansible_search_path' from source: unknown 11579 1726882197.33160: variable 'ansible_search_path' from source: unknown 11579 1726882197.33212: calling self._execute() 11579 1726882197.33421: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882197.33434: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882197.33454: variable 'omit' from source: magic vars 11579 1726882197.33914: variable 'ansible_distribution_major_version' from source: facts 11579 1726882197.33927: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882197.33933: variable 'omit' from source: magic vars 11579 1726882197.34028: variable 'omit' from source: magic vars 11579 1726882197.34058: variable 'omit' from source: magic vars 11579 1726882197.34161: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882197.34165: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882197.34172: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882197.34178: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882197.34199: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882197.34229: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882197.34232: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882197.34234: variable 'ansible_ssh_extra_args' from source: host 
vars for 'managed_node1' 11579 1726882197.34342: Set connection var ansible_timeout to 10 11579 1726882197.34348: Set connection var ansible_shell_type to sh 11579 1726882197.34355: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882197.34360: Set connection var ansible_shell_executable to /bin/sh 11579 1726882197.34380: Set connection var ansible_pipelining to False 11579 1726882197.34383: Set connection var ansible_connection to ssh 11579 1726882197.34397: variable 'ansible_shell_executable' from source: unknown 11579 1726882197.34401: variable 'ansible_connection' from source: unknown 11579 1726882197.34404: variable 'ansible_module_compression' from source: unknown 11579 1726882197.34406: variable 'ansible_shell_type' from source: unknown 11579 1726882197.34408: variable 'ansible_shell_executable' from source: unknown 11579 1726882197.34410: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882197.34416: variable 'ansible_pipelining' from source: unknown 11579 1726882197.34418: variable 'ansible_timeout' from source: unknown 11579 1726882197.34423: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882197.34632: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882197.34643: variable 'omit' from source: magic vars 11579 1726882197.34648: starting attempt loop 11579 1726882197.34651: running the handler 11579 1726882197.34665: _low_level_execute_command(): starting 11579 1726882197.34673: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882197.35813: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config 
debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.35818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.35821: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882197.35963: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.36023: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.37585: stdout chunk (state=3): >>>/root <<< 11579 1726882197.37866: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.37870: stderr chunk (state=3): >>><<< 11579 1726882197.37873: stdout chunk (state=3): >>><<< 11579 1726882197.37876: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final 
Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882197.37878: _low_level_execute_command(): starting 11579 1726882197.37881: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698 `" && echo ansible-tmp-1726882197.377645-12819-163828069285698="` echo /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698 `" ) && sleep 0' 11579 1726882197.38464: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882197.38478: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882197.38497: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882197.38518: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882197.38536: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882197.38612: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.38649: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.38666: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882197.38684: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.38807: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.40647: stdout chunk (state=3): >>>ansible-tmp-1726882197.377645-12819-163828069285698=/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698 <<< 11579 1726882197.40808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.40812: stdout chunk (state=3): >>><<< 11579 1726882197.40816: stderr chunk (state=3): >>><<< 11579 1726882197.40838: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882197.377645-12819-163828069285698=/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading 
configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882197.40901: variable 'ansible_module_compression' from source: unknown 11579 1726882197.41008: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.package_facts-ZIP_DEFLATED 11579 1726882197.41034: variable 'ansible_facts' from source: unknown 11579 1726882197.41248: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py 11579 1726882197.41548: Sending initial data 11579 1726882197.41552: Sent initial data (161 bytes) 11579 1726882197.42589: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882197.42700: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.42862: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.43108: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882197.43130: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.43191: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.44700: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882197.44703: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 <<< 11579 1726882197.44706: stderr chunk (state=3): >>>debug2: Server supports extension "statvfs@openssh.com" revision 2 <<< 11579 1726882197.44764: stderr chunk (state=3): >>>debug2: Server supports extension "fstatvfs@openssh.com" revision 2 <<< 11579 1726882197.44767: stderr chunk (state=3): >>>debug2: Server supports extension "hardlink@openssh.com" revision 1 <<< 11579 1726882197.44769: stderr chunk (state=3): >>>debug2: Server supports extension "fsync@openssh.com" revision 1 <<< 11579 1726882197.44771: stderr chunk (state=3): >>>debug2: Server supports extension "lsetstat@openssh.com" revision 1 <<< 11579 1726882197.44773: stderr chunk (state=3): >>>debug2: Server supports extension "limits@openssh.com" revision 1 <<< 11579 1726882197.44775: stderr chunk (state=3): >>>debug2: Server supports extension "expand-path@openssh.com" revision 1 <<< 11579 1726882197.44777: stderr chunk (state=3): >>>debug2: Server supports extension "copy-data" revision 1 <<< 11579 1726882197.44779: stderr chunk (state=3): >>>debug2: Unrecognised server extension "home-directory" <<< 11579 1726882197.44780: stderr chunk (state=3): 
>>>debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882197.44844: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882197.44905: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmphn_gtig2 /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py <<< 11579 1726882197.44929: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py" <<< 11579 1726882197.44980: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory <<< 11579 1726882197.44997: stderr chunk (state=3): >>>debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmphn_gtig2" to remote "/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py" <<< 11579 1726882197.45012: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py" <<< 11579 1726882197.46976: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.46980: stdout chunk (state=3): >>><<< 11579 1726882197.46982: stderr chunk (state=3): >>><<< 11579 1726882197.47102: done transferring module to remote 11579 1726882197.47106: _low_level_execute_command(): starting 11579 1726882197.47109: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/ /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py && sleep 0' 11579 1726882197.47745: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882197.47808: stderr chunk (state=3): >>>debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.47869: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.47883: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882197.47912: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.48018: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.49798: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.49802: stdout chunk (state=3): >>><<< 11579 1726882197.49804: stderr chunk (state=3): >>><<< 11579 1726882197.49806: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address 
debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882197.49809: _low_level_execute_command(): starting 11579 1726882197.49811: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/AnsiballZ_package_facts.py && sleep 0' 11579 1726882197.50450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.50454: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882197.50468: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882197.50473: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882197.50564: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882197.50571: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882197.50613: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882197.94630: stdout chunk (state=3): >>> {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": 
"noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": 
[{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": [{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": 
"4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", 
"version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": 
"8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": [{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", 
"version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": 
[{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": 
"rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": "ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": 
[{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": [{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, 
"arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], 
"cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", "version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": [{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": "kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": 
null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": 
"1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": 
"4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": "qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": "iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], 
"perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": "perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": 
"3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": 
[{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": "1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10<<< 11579 1726882197.94900: stdout chunk 
(state=3): >>>", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", 
"version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": 
"device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} <<< 11579 1726882197.96314: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882197.96377: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882197.96444: stdout chunk (state=3): >>><<< 11579 1726882197.96456: stderr chunk (state=3): >>><<< 11579 1726882197.96506: _low_level_execute_command() done: rc=0, stdout= {"ansible_facts": {"packages": {"libgcc": [{"name": "libgcc", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "linux-firmware-whence": [{"name": "linux-firmware-whence", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tzdata": [{"name": "tzdata", "version": "2024a", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "fonts-filesystem": [{"name": "fonts-filesystem", "version": "2.0.5", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "hunspell-filesystem": [{"name": "hunspell-filesystem", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "google-noto-fonts-common": [{"name": "google-noto-fonts-common", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-mono-vf-fonts": [{"name": "google-noto-sans-mono-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-sans-vf-fonts": [{"name": "google-noto-sans-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "google-noto-serif-vf-fonts": [{"name": "google-noto-serif-vf-fonts", "version": "20240401", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-mono-vf-fonts": [{"name": "redhat-mono-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "redhat-text-vf-fonts": [{"name": "redhat-text-vf-fonts", "version": "4.0.3", "release": "12.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "default-fonts-core-sans": [{"name": "default-fonts-core-sans", "version": "4.1", "release": "2.el10", "epoch": null, 
"arch": "noarch", "source": "rpm"}], "langpacks-fonts-en": [{"name": "langpacks-fonts-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-ucode-firmware": [{"name": "amd-ucode-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "atheros-firmware": [{"name": "atheros-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "brcmfmac-firmware": [{"name": "brcmfmac-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "cirrus-audio-firmware": [{"name": "cirrus-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-audio-firmware": [{"name": "intel-audio-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "mt7xxx-firmware": [{"name": "mt7xxx-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nxpwireless-firmware": [{"name": "nxpwireless-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "realtek-firmware": [{"name": "realtek-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tiwilink-firmware": [{"name": "tiwilink-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "amd-gpu-firmware": [{"name": "amd-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "intel-gpu-firmware": [{"name": "intel-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "nvidia-gpu-firmware": [{"name": "nvidia-gpu-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "linux-firmware": 
[{"name": "linux-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "xkeyboard-config": [{"name": "xkeyboard-config", "version": "2.41", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "gawk-all-langpacks": [{"name": "gawk-all-langpacks", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-data": [{"name": "vim-data", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "publicsuffix-list-dafsa": [{"name": "publicsuffix-list-dafsa", "version": "20240107", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "pcre2-syntax": [{"name": "pcre2-syntax", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "noarch", "source": "rpm"}], "ncurses-base": [{"name": "ncurses-base", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libssh-config": [{"name": "libssh-config", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-misc": [{"name": "kbd-misc", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kbd-legacy": [{"name": "kbd-legacy", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hwdata": [{"name": "hwdata", "version": "0.379", "release": "10.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "firewalld-filesystem": [{"name": "firewalld-filesystem", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf-data": [{"name": "dnf-data", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "coreutils-common": [{"name": "coreutils-common", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "centos-gpg-keys": [{"name": "centos-gpg-keys", "version": "10.0", "release": 
"0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-repos": [{"name": "centos-stream-repos", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "centos-stream-release": [{"name": "centos-stream-release", "version": "10.0", "release": "0.19.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "setup": [{"name": "setup", "version": "2.14.5", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "filesystem": [{"name": "filesystem", "version": "3.18", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "basesystem": [{"name": "basesystem", "version": "11", "release": "21.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "glibc-gconv-extra": [{"name": "glibc-gconv-extra", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-langpack-en": [{"name": "glibc-langpack-en", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc-common": [{"name": "glibc-common", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glibc": [{"name": "glibc", "version": "2.39", "release": "17.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses-libs": [{"name": "ncurses-libs", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bash": [{"name": "bash", "version": "5.2.26", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "zlib-ng-compat": [{"name": "zlib-ng-compat", "version": "2.1.6", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libuuid": [{"name": "libuuid", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz-libs": [{"name": "xz-libs", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libblkid": [{"name": "libblkid", "version": 
"2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libstdc++": [{"name": "libstdc++", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "popt": [{"name": "popt", "version": "1.19", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libzstd": [{"name": "libzstd", "version": "1.5.5", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libelf": [{"name": "elfutils-libelf", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "readline": [{"name": "readline", "version": "8.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "bzip2-libs": [{"name": "bzip2-libs", "version": "1.0.8", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcom_err": [{"name": "libcom_err", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmnl": [{"name": "libmnl", "version": "1.0.5", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxcrypt": [{"name": "libxcrypt", "version": "4.4.36", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crypto-policies": [{"name": "crypto-policies", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "alternatives": [{"name": "alternatives", "version": "1.30", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libxml2": [{"name": "libxml2", "version": "2.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng": [{"name": "libcap-ng", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit-libs": [{"name": "audit-libs", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgpg-error": [{"name": "libgpg-error", "version": "1.50", "release": "1.el10", 
"epoch": null, "arch": "x86_64", "source": "rpm"}], "libtalloc": [{"name": "libtalloc", "version": "2.4.2", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcre2": [{"name": "pcre2", "version": "10.44", "release": "1.el10.2", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grep": [{"name": "grep", "version": "3.11", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sqlite-libs": [{"name": "sqlite-libs", "version": "3.46.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm-libs": [{"name": "gdbm-libs", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libffi": [{"name": "libffi", "version": "3.4.4", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libunistring": [{"name": "libunistring", "version": "1.1", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libidn2": [{"name": "libidn2", "version": "2.3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-common": [{"name": "grub2-common", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libedit": [{"name": "libedit", "version": "3.1", "release": "51.20230828cvs.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "expat": [{"name": "expat", "version": "2.6.2", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gmp": [{"name": "gmp", "version": "6.2.1", "release": "9.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "jansson": [{"name": "jansson", "version": "2.14", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "json-c": [{"name": "json-c", "version": "0.17", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libattr": [{"name": "libattr", "version": "2.5.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libacl": [{"name": "libacl", "version": "2.3.2", "release": 
"3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsepol": [{"name": "libsepol", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libselinux": [{"name": "libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sed": [{"name": "sed", "version": "4.9", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmount": [{"name": "libmount", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsmartcols": [{"name": "libsmartcols", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "findutils": [{"name": "findutils", "version": "4.10.0", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libsemanage": [{"name": "libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtevent": [{"name": "libtevent", "version": "0.16.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libassuan": [{"name": "libassuan", "version": "2.5.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbpf": [{"name": "libbpf", "version": "1.5.0", "release": "1.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "hunspell-en-GB": [{"name": "hunspell-en-GB", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell-en-US": [{"name": "hunspell-en-US", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "hunspell": [{"name": "hunspell", "version": "1.7.2", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfdisk": [{"name": "libfdisk", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils-libs": [{"name": "keyutils-libs", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "libeconf": [{"name": "libeconf", "version": "0.6.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pam-libs": [{"name": "pam-libs", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap": [{"name": "libcap", "version": "2.69", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-libs": [{"name": "systemd-libs", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "shadow-utils": [{"name": "shadow-utils", "version": "4.15.0", "release": "3.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "util-linux-core": [{"name": "util-linux-core", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-libs": [{"name": "dbus-libs", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libtasn1": [{"name": "libtasn1", "version": "4.19.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit": [{"name": "p11-kit", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "p11-kit-trust": [{"name": "p11-kit-trust", "version": "0.25.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnutls": [{"name": "gnutls", "version": "3.8.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "glib2": [{"name": "glib2", "version": "2.80.4", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-libs": [{"name": "polkit-libs", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-libnm": [{"name": "NetworkManager-libnm", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "openssl-libs": [{"name": "openssl-libs", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "coreutils": 
[{"name": "coreutils", "version": "9.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ca-certificates": [{"name": "ca-certificates", "version": "2024.2.69_v8.0.303", "release": "101.1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "tpm2-tss": [{"name": "tpm2-tss", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gzip": [{"name": "gzip", "version": "1.13", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod": [{"name": "kmod", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kmod-libs": [{"name": "kmod-libs", "version": "31", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib": [{"name": "cracklib", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-lib": [{"name": "cyrus-sasl-lib", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgcrypt": [{"name": "libgcrypt", "version": "1.11.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libksba": [{"name": "libksba", "version": "1.6.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnftnl": [{"name": "libnftnl", "version": "1.2.7", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file-libs": [{"name": "file-libs", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "file": [{"name": "file", "version": "5.45", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "diffutils": [{"name": "diffutils", "version": "3.10", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbasicobjects": [{"name": "libbasicobjects", "version": "0.1.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcollection": [{"name": "libcollection", "version": "0.7.0", 
"release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdhash": [{"name": "libdhash", "version": "0.5.0", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnl3": [{"name": "libnl3", "version": "3.9.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libref_array": [{"name": "libref_array", "version": "0.1.5", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libseccomp": [{"name": "libseccomp", "version": "2.5.3", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_idmap": [{"name": "libsss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libtdb": [{"name": "libtdb", "version": "1.4.10", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lua-libs": [{"name": "lua-libs", "version": "5.4.6", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lz4-libs": [{"name": "lz4-libs", "version": "1.9.4", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libarchive": [{"name": "libarchive", "version": "3.7.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lzo": [{"name": "lzo", "version": "2.10", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "npth": [{"name": "npth", "version": "1.6", "release": "19.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "numactl-libs": [{"name": "numactl-libs", "version": "2.0.16", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "squashfs-tools": [{"name": "squashfs-tools", "version": "4.6.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cracklib-dicts": [{"name": "cracklib-dicts", "version": "2.9.11", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpwquality": [{"name": "libpwquality", "version": "1.4.5", "release": "11.el10", "epoch": null, "arch": 
"x86_64", "source": "rpm"}], "ima-evm-utils": [{"name": "ima-evm-utils", "version": "1.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-pip-wheel": [{"name": "python3-pip-wheel", "version": "23.3.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "which": [{"name": "which", "version": "2.21", "release": "42.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libevent": [{"name": "libevent", "version": "2.1.12", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openldap": [{"name": "openldap", "version": "2.6.7", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_certmap": [{"name": "libsss_certmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sequoia": [{"name": "rpm-sequoia", "version": "1.6.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-audit": [{"name": "rpm-plugin-audit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-libs": [{"name": "rpm-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsolv": [{"name": "libsolv", "version": "0.7.29", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-systemd-inhibit": [{"name": "rpm-plugin-systemd-inhibit", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gobject-introspection": [{"name": "gobject-introspection", "version": "1.79.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsecret": [{"name": "libsecret", "version": "0.21.2", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pinentry": [{"name": "pinentry", "version": "1.3.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libusb1": [{"name": "libusb1", "version": "1.0.27", 
"release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "procps-ng": [{"name": "procps-ng", "version": "4.0.4", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kbd": [{"name": "kbd", "version": "2.6.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hunspell-en": [{"name": "hunspell-en", "version": "0.20201207", "release": "10.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libselinux-utils": [{"name": "libselinux-utils", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-libs": [{"name": "gettext-libs", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpfr": [{"name": "mpfr", "version": "4.2.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gawk": [{"name": "gawk", "version": "5.3.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcomps": [{"name": "libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc-modules": [{"name": "grub2-pc-modules", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "libpsl": [{"name": "libpsl", "version": "0.21.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gdbm": [{"name": "gdbm", "version": "1.23", "release": "8.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "pam": [{"name": "pam", "version": "1.6.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xz": [{"name": "xz", "version": "5.6.2", "release": "2.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libxkbcommon": [{"name": "libxkbcommon", "version": "1.7.0", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "groff-base": [{"name": "groff-base", "version": "1.23.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ethtool": [{"name": 
"ethtool", "version": "6.7", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "ipset-libs": [{"name": "ipset-libs", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ipset": [{"name": "ipset", "version": "7.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs-libs": [{"name": "e2fsprogs-libs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libss": [{"name": "libss", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "snappy": [{"name": "snappy", "version": "1.1.10", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pigz": [{"name": "pigz", "version": "2.8", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus-common": [{"name": "dbus-common", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "dbus-broker": [{"name": "dbus-broker", "version": "35", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dbus": [{"name": "dbus", "version": "1.14.10", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "hostname": [{"name": "hostname", "version": "3.23", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-tools-libs": [{"name": "kernel-tools-libs", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "less": [{"name": "less", "version": "661", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "psmisc": [{"name": "psmisc", "version": "23.6", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute": [{"name": "iproute", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "memstrack": [{"name": "memstrack", "version": "0.2.5", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": 
"rpm"}], "c-ares": [{"name": "c-ares", "version": "1.25.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cpio": [{"name": "cpio", "version": "2.15", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "duktape": [{"name": "duktape", "version": "2.7.0", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse-libs": [{"name": "fuse-libs", "version": "2.9.9", "release": "22.el10.gating_test1", "epoch": null, "arch": "x86_64", "source": "rpm"}], "fuse3-libs": [{"name": "fuse3-libs", "version": "3.16.2", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-envsubst": [{"name": "gettext-envsubst", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gettext-runtime": [{"name": "gettext-runtime", "version": "0.22.5", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "inih": [{"name": "inih", "version": "58", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libbrotli": [{"name": "libbrotli", "version": "1.1.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcbor": [{"name": "libcbor", "version": "0.11.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfido2": [{"name": "libfido2", "version": "1.14.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libgomp": [{"name": "libgomp", "version": "14.2.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libndp": [{"name": "libndp", "version": "1.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfnetlink": [{"name": "libnfnetlink", "version": "1.0.1", "release": "28.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnetfilter_conntrack": [{"name": "libnetfilter_conntrack", "version": "1.0.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-libs": 
[{"name": "iptables-libs", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iptables-nft": [{"name": "iptables-nft", "version": "1.8.10", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nftables": [{"name": "nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libnghttp2": [{"name": "libnghttp2", "version": "1.62.1", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpath_utils": [{"name": "libpath_utils", "version": "0.2.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libini_config": [{"name": "libini_config", "version": "1.3.1", "release": "57.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libpipeline": [{"name": "libpipeline", "version": "1.5.7", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_nss_idmap": [{"name": "libsss_nss_idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsss_sudo": [{"name": "libsss_sudo", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "liburing": [{"name": "liburing", "version": "2.5", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto": [{"name": "libverto", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "krb5-libs": [{"name": "krb5-libs", "version": "1.21.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cyrus-sasl-gssapi": [{"name": "cyrus-sasl-gssapi", "version": "2.1.28", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libssh": [{"name": "libssh", "version": "0.10.6", "release": "8.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcurl": [{"name": "libcurl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], 
"authselect-libs": [{"name": "authselect-libs", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cryptsetup-libs": [{"name": "cryptsetup-libs", "version": "2.7.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-libs": [{"name": "device-mapper-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper": [{"name": "device-mapper", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "elfutils-debuginfod-client": [{"name": "elfutils-debuginfod-client", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-libs": [{"name": "elfutils-libs", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "elfutils-default-yama-scope": [{"name": "elfutils-default-yama-scope", "version": "0.191", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libutempter": [{"name": "libutempter", "version": "1.2.1", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-pam": [{"name": "systemd-pam", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "util-linux": [{"name": "util-linux", "version": "2.40.2", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd": [{"name": "systemd", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools-minimal": [{"name": "grub2-tools-minimal", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "cronie-anacron": [{"name": "cronie-anacron", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cronie": [{"name": "cronie", "version": "1.7.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "crontabs": [{"name": "crontabs", 
"version": "1.11^20190603git9e74f2d", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "polkit": [{"name": "polkit", "version": "125", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "polkit-pkla-compat": [{"name": "polkit-pkla-compat", "version": "0.1", "release": "29.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh": [{"name": "openssh", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils-gold": [{"name": "binutils-gold", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "binutils": [{"name": "binutils", "version": "2.41", "release": "48.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-service": [{"name": "initscripts-service", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "audit-rules": [{"name": "audit-rules", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "audit": [{"name": "audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iputils": [{"name": "iputils", "version": "20240905", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi": [{"name": "libkcapi", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hasher": [{"name": "libkcapi-hasher", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libkcapi-hmaccalc": [{"name": "libkcapi-hmaccalc", "version": "1.5.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "logrotate": [{"name": "logrotate", "version": "3.22.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "makedumpfile": [{"name": "makedumpfile", "version": "1.7.5", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-build-libs": 
[{"name": "rpm-build-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kpartx": [{"name": "kpartx", "version": "0.9.9", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "curl": [{"name": "curl", "version": "8.9.1", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm": [{"name": "rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "policycoreutils": [{"name": "policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "selinux-policy": [{"name": "selinux-policy", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "selinux-policy-targeted": [{"name": "selinux-policy-targeted", "version": "40.13.9", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "librepo": [{"name": "librepo", "version": "1.18.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tss-fapi": [{"name": "tpm2-tss-fapi", "version": "4.1.3", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tpm2-tools": [{"name": "tpm2-tools", "version": "5.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grubby": [{"name": "grubby", "version": "8.40", "release": "76.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "systemd-udev": [{"name": "systemd-udev", "version": "256", "release": "14.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut": [{"name": "dracut", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "os-prober": [{"name": "os-prober", "version": "1.81", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-tools": [{"name": "grub2-tools", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules-core": [{"name": 
"kernel-modules-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel-core": [{"name": "kernel-core", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager": [{"name": "NetworkManager", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "kernel-modules": [{"name": "kernel-modules", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-squash": [{"name": "dracut-squash", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-client": [{"name": "sssd-client", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libyaml": [{"name": "libyaml", "version": "0.2.5", "release": "15.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libmodulemd": [{"name": "libmodulemd", "version": "2.15.0", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libdnf": [{"name": "libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lmdb-libs": [{"name": "lmdb-libs", "version": "0.9.32", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libldb": [{"name": "libldb", "version": "2.9.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-common": [{"name": "sssd-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-krb5-common": [{"name": "sssd-krb5-common", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "mpdecimal": [{"name": "mpdecimal", "version": "2.5.1", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python-unversioned-command": [{"name": "python-unversioned-command", "version": "3.12.5", "release": "2.el10", 
"epoch": null, "arch": "noarch", "source": "rpm"}], "python3": [{"name": "python3", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libs": [{"name": "python3-libs", "version": "3.12.5", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-dbus": [{"name": "python3-dbus", "version": "1.3.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libdnf": [{"name": "python3-libdnf", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-hawkey": [{"name": "python3-hawkey", "version": "0.73.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-gobject-base-noarch": [{"name": "python3-gobject-base-noarch", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-gobject-base": [{"name": "python3-gobject-base", "version": "3.46.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-libcomps": [{"name": "python3-libcomps", "version": "0.1.21", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo": [{"name": "sudo", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sudo-python-plugin": [{"name": "sudo-python-plugin", "version": "1.9.15", "release": "7.p5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-nftables": [{"name": "python3-nftables", "version": "1.0.9", "release": "4.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "python3-firewall": [{"name": "python3-firewall", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-six": [{"name": "python3-six", "version": "1.16.0", "release": "15.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dateutil": [{"name": "python3-dateutil", "version": "2.8.2", "release": "14.el10", "epoch": 1, "arch": 
"noarch", "source": "rpm"}], "python3-systemd": [{"name": "python3-systemd", "version": "235", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libcap-ng-python3": [{"name": "libcap-ng-python3", "version": "0.8.4", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "oniguruma": [{"name": "oniguruma", "version": "6.9.9", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "jq": [{"name": "jq", "version": "1.7.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "dracut-network": [{"name": "dracut-network", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kexec-tools": [{"name": "kexec-tools", "version": "2.0.29", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kdump-utils": [{"name": "kdump-utils", "version": "1.0.43", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pciutils-libs": [{"name": "pciutils-libs", "version": "3.13.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-libs": [{"name": "pcsc-lite-libs", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite-ccid": [{"name": "pcsc-lite-ccid", "version": "1.6.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "pcsc-lite": [{"name": "pcsc-lite", "version": "2.2.3", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2-smime": [{"name": "gnupg2-smime", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gnupg2": [{"name": "gnupg2", "version": "2.4.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-sign-libs": [{"name": "rpm-sign-libs", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpm": [{"name": "python3-rpm", "version": "4.19.1.1", "release": "3.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "python3-dnf": [{"name": "python3-dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "dnf": [{"name": "dnf", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-dnf-plugins-core": [{"name": "python3-dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "sg3_utils-libs": [{"name": "sg3_utils-libs", "version": "1.48", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "slang": [{"name": "slang", "version": "2.3.3", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "newt": [{"name": "newt", "version": "0.52.24", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "userspace-rcu": [{"name": "userspace-rcu", "version": "0.14.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libestr": [{"name": "libestr", "version": "0.1.11", "release": "10.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libfastjson": [{"name": "libfastjson", "version": "1.2304.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "langpacks-core-en": [{"name": "langpacks-core-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "langpacks-en": [{"name": "langpacks-en", "version": "4.1", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rsyslog": [{"name": "rsyslog", "version": "8.2408.0", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xfsprogs": [{"name": "xfsprogs", "version": "6.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "NetworkManager-tui": [{"name": "NetworkManager-tui", "version": "1.48.10", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "sg3_utils": [{"name": "sg3_utils", "version": "1.48", "release": "5.el10", "epoch": 
null, "arch": "x86_64", "source": "rpm"}], "dnf-plugins-core": [{"name": "dnf-plugins-core", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "yum": [{"name": "yum", "version": "4.20.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "kernel-tools": [{"name": "kernel-tools", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "firewalld": [{"name": "firewalld", "version": "2.2.1", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "crypto-policies-scripts": [{"name": "crypto-policies-scripts", "version": "20240822", "release": "1.git367040b.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libselinux": [{"name": "python3-libselinux", "version": "3.7", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "sssd-kcm": [{"name": "sssd-kcm", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "kernel": [{"name": "kernel", "version": "6.11.0", "release": "0.rc6.23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "grub2-pc": [{"name": "grub2-pc", "version": "2.06", "release": "127.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dracut-config-rescue": [{"name": "dracut-config-rescue", "version": "102", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-clients": [{"name": "openssh-clients", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "openssh-server": [{"name": "openssh-server", "version": "9.8p1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "chrony": [{"name": "chrony", "version": "4.6", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "microcode_ctl": [{"name": "microcode_ctl", "version": "20240531", "release": "1.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "qemu-guest-agent": [{"name": 
"qemu-guest-agent", "version": "9.0.0", "release": "8.el10", "epoch": 18, "arch": "x86_64", "source": "rpm"}], "parted": [{"name": "parted", "version": "3.6", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "authselect": [{"name": "authselect", "version": "1.5.0", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "man-db": [{"name": "man-db", "version": "2.12.0", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iproute-tc": [{"name": "iproute-tc", "version": "6.7.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "e2fsprogs": [{"name": "e2fsprogs", "version": "1.47.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "initscripts-rename-device": [{"name": "initscripts-rename-device", "version": "10.26", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rpm-plugin-selinux": [{"name": "rpm-plugin-selinux", "version": "4.19.1.1", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "irqbalance": [{"name": "irqbalance", "version": "1.9.4", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "prefixdevname": [{"name": "prefixdevname", "version": "0.2.0", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-minimal": [{"name": "vim-minimal", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "lshw": [{"name": "lshw", "version": "B.02.20", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "ncurses": [{"name": "ncurses", "version": "6.4", "release": "13.20240127.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libsysfs": [{"name": "libsysfs", "version": "2.1.1", "release": "13.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lsscsi": [{"name": "lsscsi", "version": "0.32", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "iwlwifi-dvm-firmware": [{"name": 
"iwlwifi-dvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "iwlwifi-mvm-firmware": [{"name": "iwlwifi-mvm-firmware", "version": "20240910", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rootfiles": [{"name": "rootfiles", "version": "8.1", "release": "37.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "libtirpc": [{"name": "libtirpc", "version": "1.3.5", "release": "0.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "git-core": [{"name": "git-core", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libnfsidmap": [{"name": "libnfsidmap", "version": "2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "git-core-doc": [{"name": "git-core-doc", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "rpcbind": [{"name": "rpcbind", "version": "1.2.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Digest": [{"name": "perl-Digest", "version": "1.20", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Digest-MD5": [{"name": "perl-Digest-MD5", "version": "2.59", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-B": [{"name": "perl-B", "version": "1.89", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-FileHandle": [{"name": "perl-FileHandle", "version": "2.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Data-Dumper": [{"name": "perl-Data-Dumper", "version": "2.189", "release": "511.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-libnet": [{"name": "perl-libnet", "version": "3.15", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-URI": [{"name": "perl-URI", "version": "5.27", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], 
"perl-AutoLoader": [{"name": "perl-AutoLoader", "version": "5.74", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Text-Tabs+Wrap": [{"name": "perl-Text-Tabs+Wrap", "version": "2024.001", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Mozilla-CA": [{"name": "perl-Mozilla-CA", "version": "20231213", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-if": [{"name": "perl-if", "version": "0.61.000", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-locale": [{"name": "perl-locale", "version": "1.12", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-IP": [{"name": "perl-IO-Socket-IP", "version": "0.42", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Time-Local": [{"name": "perl-Time-Local", "version": "1.350", "release": "510.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "perl-File-Path": [{"name": "perl-File-Path", "version": "2.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Escapes": [{"name": "perl-Pod-Escapes", "version": "1.07", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-IO-Socket-SSL": [{"name": "perl-IO-Socket-SSL", "version": "2.085", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Net-SSLeay": [{"name": "perl-Net-SSLeay", "version": "1.94", "release": "6.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Class-Struct": [{"name": "perl-Class-Struct", "version": "0.68", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Term-ANSIColor": [{"name": "perl-Term-ANSIColor", "version": "5.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-POSIX": [{"name": "perl-POSIX", "version": "2.20", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-IPC-Open3": [{"name": 
"perl-IPC-Open3", "version": "1.22", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-Temp": [{"name": "perl-File-Temp", "version": "0.231.100", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Term-Cap": [{"name": "perl-Term-Cap", "version": "1.18", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Pod-Simple": [{"name": "perl-Pod-Simple", "version": "3.45", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-HTTP-Tiny": [{"name": "perl-HTTP-Tiny", "version": "0.088", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Socket": [{"name": "perl-Socket", "version": "2.038", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-SelectSaver": [{"name": "perl-SelectSaver", "version": "1.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Symbol": [{"name": "perl-Symbol", "version": "1.09", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-File-stat": [{"name": "perl-File-stat", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-podlators": [{"name": "perl-podlators", "version": "5.01", "release": "510.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Pod-Perldoc": [{"name": "perl-Pod-Perldoc", "version": "3.28.01", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Fcntl": [{"name": "perl-Fcntl", "version": "1.18", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Text-ParseWords": [{"name": "perl-Text-ParseWords", "version": "3.31", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-base": [{"name": "perl-base", "version": "2.27", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-mro": [{"name": "perl-mro", "version": "1.29", "release": "510.el10", "epoch": 0, "arch": 
"x86_64", "source": "rpm"}], "perl-IO": [{"name": "perl-IO", "version": "1.55", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-overloading": [{"name": "perl-overloading", "version": "0.02", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Pod-Usage": [{"name": "perl-Pod-Usage", "version": "2.03", "release": "510.el10", "epoch": 4, "arch": "noarch", "source": "rpm"}], "perl-Errno": [{"name": "perl-Errno", "version": "1.38", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-File-Basename": [{"name": "perl-File-Basename", "version": "2.86", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Std": [{"name": "perl-Getopt-Std", "version": "1.14", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-MIME-Base64": [{"name": "perl-MIME-Base64", "version": "3.16", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-Scalar-List-Utils": [{"name": "perl-Scalar-List-Utils", "version": "1.63", "release": "510.el10", "epoch": 5, "arch": "x86_64", "source": "rpm"}], "perl-constant": [{"name": "perl-constant", "version": "1.33", "release": "511.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Storable": [{"name": "perl-Storable", "version": "3.32", "release": "510.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "perl-overload": [{"name": "perl-overload", "version": "1.37", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-parent": [{"name": "perl-parent", "version": "0.241", "release": "511.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-vars": [{"name": "perl-vars", "version": "1.05", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-Getopt-Long": [{"name": "perl-Getopt-Long", "version": "2.58", "release": "2.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-Carp": [{"name": "perl-Carp", "version": 
"1.54", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-Exporter": [{"name": "perl-Exporter", "version": "5.78", "release": "510.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "perl-PathTools": [{"name": "perl-PathTools", "version": "3.91", "release": "510.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-DynaLoader": [{"name": "perl-DynaLoader", "version": "1.56", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-NDBM_File": [{"name": "perl-NDBM_File", "version": "1.17", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Encode": [{"name": "perl-Encode", "version": "3.21", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-libs": [{"name": "perl-libs", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-interpreter": [{"name": "perl-interpreter", "version": "5.40.0", "release": "510.el10", "epoch": 4, "arch": "x86_64", "source": "rpm"}], "perl-Error": [{"name": "perl-Error", "version": "0.17029", "release": "17.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "perl-File-Find": [{"name": "perl-File-Find", "version": "1.44", "release": "510.el10", "epoch": 0, "arch": "noarch", "source": "rpm"}], "perl-TermReadKey": [{"name": "perl-TermReadKey", "version": "2.38", "release": "23.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "perl-lib": [{"name": "perl-lib", "version": "0.65", "release": "510.el10", "epoch": 0, "arch": "x86_64", "source": "rpm"}], "perl-Git": [{"name": "perl-Git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "git": [{"name": "git", "version": "2.45.2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "xxd": [{"name": "xxd", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "libxslt": [{"name": "libxslt", "version": "1.1.39", 
"release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-lxml": [{"name": "python3-lxml", "version": "5.2.1", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "yum-utils": [{"name": "yum-utils", "version": "4.7.0", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "vim-filesystem": [{"name": "vim-filesystem", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "noarch", "source": "rpm"}], "vim-common": [{"name": "vim-common", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "time": [{"name": "time", "version": "1.9", "release": "24.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "tar": [{"name": "tar", "version": "1.35", "release": "4.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "quota-nls": [{"name": "quota-nls", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "noarch", "source": "rpm"}], "quota": [{"name": "quota", "version": "4.09", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "nettle": [{"name": "nettle", "version": "3.10", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wget": [{"name": "wget", "version": "1.24.5", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "make": [{"name": "make", "version": "4.4.1", "release": "7.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "libev": [{"name": "libev", "version": "4.33", "release": "12.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "libverto-libev": [{"name": "libverto-libev", "version": "0.3.2", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "gssproxy": [{"name": "gssproxy", "version": "0.9.2", "release": "7.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "keyutils": [{"name": "keyutils", "version": "1.6.3", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "nfs-utils": [{"name": "nfs-utils", "version": 
"2.7.1", "release": "1.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "bc": [{"name": "bc", "version": "1.07.1", "release": "22.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "beakerlib-redhat": [{"name": "beakerlib-redhat", "version": "1", "release": "35.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "beakerlib": [{"name": "beakerlib", "version": "1.29.3", "release": "1.el9", "epoch": null, "arch": "noarch", "source": "rpm"}], "restraint": [{"name": "restraint", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "restraint-rhts": [{"name": "restraint-rhts", "version": "0.4.4", "release": "1.el9", "epoch": null, "arch": "x86_64", "source": "rpm"}], "vim-enhanced": [{"name": "vim-enhanced", "version": "9.1.083", "release": "2.el10", "epoch": 2, "arch": "x86_64", "source": "rpm"}], "sssd-nfs-idmap": [{"name": "sssd-nfs-idmap", "version": "2.10.0~beta2", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rsync": [{"name": "rsync", "version": "3.3.0", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-rpds-py": [{"name": "python3-rpds-py", "version": "0.17.1", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-attrs": [{"name": "python3-attrs", "version": "23.2.0", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-referencing": [{"name": "python3-referencing", "version": "0.31.1", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-idna": [{"name": "python3-idna", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-urllib3": [{"name": "python3-urllib3", "version": "1.26.19", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonschema-specifications": [{"name": "python3-jsonschema-specifications", "version": "2023.11.2", "release": "5.el10", "epoch": null, "arch": "noarch", 
"source": "rpm"}], "python3-jsonschema": [{"name": "python3-jsonschema", "version": "4.19.1", "release": "6.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyserial": [{"name": "python3-pyserial", "version": "3.5", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-oauthlib": [{"name": "python3-oauthlib", "version": "3.2.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-markupsafe": [{"name": "python3-markupsafe", "version": "2.1.3", "release": "5.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jinja2": [{"name": "python3-jinja2", "version": "3.1.4", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-libsemanage": [{"name": "python3-libsemanage", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-jsonpointer": [{"name": "python3-jsonpointer", "version": "2.3", "release": "8.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-jsonpatch": [{"name": "python3-jsonpatch", "version": "1.33", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-distro": [{"name": "python3-distro", "version": "1.9.0", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-configobj": [{"name": "python3-configobj", "version": "5.0.8", "release": "9.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-audit": [{"name": "python3-audit", "version": "4.0", "release": "9.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "checkpolicy": [{"name": "checkpolicy", "version": "3.7", "release": "1.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-setuptools": [{"name": "python3-setuptools", "version": "69.0.3", "release": "5.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-setools": [{"name": "python3-setools", "version": "4.5.1", "release": "3.el10", "epoch": null, "arch": "x86_64", 
"source": "rpm"}], "python3-policycoreutils": [{"name": "python3-policycoreutils", "version": "3.7", "release": "2.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-pyyaml": [{"name": "python3-pyyaml", "version": "6.0.1", "release": "18.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "python3-charset-normalizer": [{"name": "python3-charset-normalizer", "version": "3.3.2", "release": "4.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "python3-requests": [{"name": "python3-requests", "version": "2.32.3", "release": "1.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "openssl": [{"name": "openssl", "version": "3.2.2", "release": "12.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dhcpcd": [{"name": "dhcpcd", "version": "10.0.6", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "cloud-init": [{"name": "cloud-init", "version": "24.1.4", "release": "17.el10", "epoch": null, "arch": "noarch", "source": "rpm"}], "device-mapper-event-libs": [{"name": "device-mapper-event-libs", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "libaio": [{"name": "libaio", "version": "0.3.111", "release": "20.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "device-mapper-event": [{"name": "device-mapper-event", "version": "1.02.198", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "lvm2-libs": [{"name": "lvm2-libs", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "device-mapper-persistent-data": [{"name": "device-mapper-persistent-data", "version": "1.0.11", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "lvm2": [{"name": "lvm2", "version": "2.03.24", "release": "2.el10", "epoch": 10, "arch": "x86_64", "source": "rpm"}], "cloud-utils-growpart": [{"name": "cloud-utils-growpart", "version": "0.33", "release": "10.el10", "epoch": null, "arch": "noarch", "source": 
"rpm"}], "jitterentropy": [{"name": "jitterentropy", "version": "3.5.0", "release": "4.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "rng-tools": [{"name": "rng-tools", "version": "6.17", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "hostapd": [{"name": "hostapd", "version": "2.10", "release": "11.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}], "wpa_supplicant": [{"name": "wpa_supplicant", "version": "2.10", "release": "11.el10", "epoch": 1, "arch": "x86_64", "source": "rpm"}], "dnsmasq": [{"name": "dnsmasq", "version": "2.90", "release": "3.el10", "epoch": null, "arch": "x86_64", "source": "rpm"}]}}, "invocation": {"module_args": {"manager": ["auto"], "strategy": "first"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882198.01669: done with _execute_module (package_facts, {'_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'package_facts', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882198.01999: _low_level_execute_command(): starting 11579 1726882198.02003: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882197.377645-12819-163828069285698/ > /dev/null 2>&1 && sleep 0' 11579 1726882198.02856: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882198.02910: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882198.02973: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882198.03003: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882198.03023: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882198.03115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882198.04983: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882198.05006: stdout chunk (state=3): >>><<< 11579 1726882198.05020: stderr chunk (state=3): >>><<< 11579 1726882198.05038: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882198.05400: handler run complete 11579 
1726882198.06769: variable 'ansible_facts' from source: unknown 11579 1726882198.07867: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.12408: variable 'ansible_facts' from source: unknown 11579 1726882198.13314: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.14823: attempt loop complete, returning result 11579 1726882198.14845: _execute() done 11579 1726882198.14854: dumping result to json 11579 1726882198.15195: done dumping result, returning 11579 1726882198.15341: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check which packages are installed [12673a56-9f93-f197-7423-000000000497] 11579 1726882198.15352: sending task result for task 12673a56-9f93-f197-7423-000000000497 11579 1726882198.20147: done sending task result for task 12673a56-9f93-f197-7423-000000000497 11579 1726882198.20150: WORKER PROCESS EXITING ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882198.20289: no more pending results, returning what we have 11579 1726882198.20292: results queue empty 11579 1726882198.20296: checking for any_errors_fatal 11579 1726882198.20300: done checking for any_errors_fatal 11579 1726882198.20301: checking for max_fail_percentage 11579 1726882198.20302: done checking for max_fail_percentage 11579 1726882198.20303: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.20304: done checking to see if all hosts have failed 11579 1726882198.20304: getting the remaining hosts for this loop 11579 1726882198.20305: done getting the remaining hosts for this loop 11579 1726882198.20308: getting the next task for host managed_node1 11579 1726882198.20315: done getting next task for host managed_node1 11579 1726882198.20318: ^ task is: 
TASK: fedora.linux_system_roles.network : Print network provider 11579 1726882198.20322: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882198.20332: getting variables 11579 1726882198.20333: in VariableManager get_vars() 11579 1726882198.20367: Calling all_inventory to load vars for managed_node1 11579 1726882198.20370: Calling groups_inventory to load vars for managed_node1 11579 1726882198.20372: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.20380: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.20383: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.20386: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.22968: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.24947: done with get_vars() 11579 1726882198.24979: done getting variables 11579 1726882198.25054: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Print network provider] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:7 Friday 20 September 2024 21:29:58 -0400 (0:00:00.936) 0:00:26.959 ****** 11579 1726882198.25102: entering _queue_task() for managed_node1/debug 11579 1726882198.25504: worker is 1 (out of 1 available) 11579 1726882198.25519: exiting _queue_task() for managed_node1/debug 11579 1726882198.25530: done queuing things up, now waiting for results queue to drain 11579 1726882198.25532: waiting for pending results... 11579 1726882198.25894: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider 11579 1726882198.26100: in run() - task 12673a56-9f93-f197-7423-00000000007d 11579 1726882198.26105: variable 'ansible_search_path' from source: unknown 11579 1726882198.26108: variable 'ansible_search_path' from source: unknown 11579 1726882198.26111: calling self._execute() 11579 1726882198.26170: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.26174: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.26500: variable 'omit' from source: magic vars 11579 1726882198.26698: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.26702: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.26705: variable 'omit' from source: magic vars 11579 1726882198.26707: variable 'omit' from source: magic vars 11579 1726882198.26796: variable 'network_provider' from source: set_fact 11579 1726882198.26815: variable 'omit' from source: magic vars 11579 1726882198.26854: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 
1726882198.26901: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882198.26929: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882198.26946: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882198.26959: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882198.26999: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882198.27003: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.27007: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.27117: Set connection var ansible_timeout to 10 11579 1726882198.27123: Set connection var ansible_shell_type to sh 11579 1726882198.27131: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882198.27136: Set connection var ansible_shell_executable to /bin/sh 11579 1726882198.27144: Set connection var ansible_pipelining to False 11579 1726882198.27146: Set connection var ansible_connection to ssh 11579 1726882198.27167: variable 'ansible_shell_executable' from source: unknown 11579 1726882198.27170: variable 'ansible_connection' from source: unknown 11579 1726882198.27173: variable 'ansible_module_compression' from source: unknown 11579 1726882198.27175: variable 'ansible_shell_type' from source: unknown 11579 1726882198.27177: variable 'ansible_shell_executable' from source: unknown 11579 1726882198.27179: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.27181: variable 'ansible_pipelining' from source: unknown 11579 1726882198.27186: variable 'ansible_timeout' from source: unknown 11579 1726882198.27190: variable 'ansible_ssh_extra_args' 
from source: host vars for 'managed_node1' 11579 1726882198.27347: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882198.27598: variable 'omit' from source: magic vars 11579 1726882198.27601: starting attempt loop 11579 1726882198.27603: running the handler 11579 1726882198.27605: handler run complete 11579 1726882198.27607: attempt loop complete, returning result 11579 1726882198.27608: _execute() done 11579 1726882198.27610: dumping result to json 11579 1726882198.27612: done dumping result, returning 11579 1726882198.27613: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Print network provider [12673a56-9f93-f197-7423-00000000007d] 11579 1726882198.27615: sending task result for task 12673a56-9f93-f197-7423-00000000007d 11579 1726882198.27677: done sending task result for task 12673a56-9f93-f197-7423-00000000007d 11579 1726882198.27681: WORKER PROCESS EXITING ok: [managed_node1] => {} MSG: Using network provider: nm 11579 1726882198.27756: no more pending results, returning what we have 11579 1726882198.27759: results queue empty 11579 1726882198.27760: checking for any_errors_fatal 11579 1726882198.27767: done checking for any_errors_fatal 11579 1726882198.27768: checking for max_fail_percentage 11579 1726882198.27770: done checking for max_fail_percentage 11579 1726882198.27771: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.27772: done checking to see if all hosts have failed 11579 1726882198.27773: getting the remaining hosts for this loop 11579 1726882198.27774: done getting the remaining hosts for this loop 11579 1726882198.27778: getting the next task for host managed_node1 11579 1726882198.27785: done getting 
next task for host managed_node1 11579 1726882198.27789: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11579 1726882198.27796: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882198.27808: getting variables 11579 1726882198.27810: in VariableManager get_vars() 11579 1726882198.27966: Calling all_inventory to load vars for managed_node1 11579 1726882198.27969: Calling groups_inventory to load vars for managed_node1 11579 1726882198.27972: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.27981: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.27984: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.27987: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.40912: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.43613: done with get_vars() 11579 1726882198.43643: done getting variables 11579 1726882198.43691: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:11 Friday 20 September 2024 21:29:58 -0400 (0:00:00.186) 0:00:27.145 ****** 11579 1726882198.43727: entering _queue_task() for managed_node1/fail 11579 1726882198.44169: worker is 1 (out of 1 available) 11579 1726882198.44183: exiting _queue_task() for managed_node1/fail 11579 1726882198.44374: done queuing things up, now waiting for results queue to drain 11579 1726882198.44376: waiting for pending results... 
11579 1726882198.44578: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider 11579 1726882198.44862: in run() - task 12673a56-9f93-f197-7423-00000000007e 11579 1726882198.44883: variable 'ansible_search_path' from source: unknown 11579 1726882198.44891: variable 'ansible_search_path' from source: unknown 11579 1726882198.44942: calling self._execute() 11579 1726882198.45055: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.45069: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.45085: variable 'omit' from source: magic vars 11579 1726882198.45571: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.45606: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.45960: variable 'network_state' from source: role '' defaults 11579 1726882198.45964: Evaluated conditional (network_state != {}): False 11579 1726882198.45967: when evaluation is False, skipping this task 11579 1726882198.45970: _execute() done 11579 1726882198.45972: dumping result to json 11579 1726882198.45975: done dumping result, returning 11579 1726882198.45978: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if using the `network_state` variable with the initscripts provider [12673a56-9f93-f197-7423-00000000007e] 11579 1726882198.45981: sending task result for task 12673a56-9f93-f197-7423-00000000007e 11579 1726882198.46054: done sending task result for task 12673a56-9f93-f197-7423-00000000007e 11579 1726882198.46057: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882198.46109: no more pending results, 
returning what we have 11579 1726882198.46113: results queue empty 11579 1726882198.46113: checking for any_errors_fatal 11579 1726882198.46121: done checking for any_errors_fatal 11579 1726882198.46121: checking for max_fail_percentage 11579 1726882198.46123: done checking for max_fail_percentage 11579 1726882198.46124: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.46125: done checking to see if all hosts have failed 11579 1726882198.46126: getting the remaining hosts for this loop 11579 1726882198.46128: done getting the remaining hosts for this loop 11579 1726882198.46131: getting the next task for host managed_node1 11579 1726882198.46138: done getting next task for host managed_node1 11579 1726882198.46142: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11579 1726882198.46146: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882198.46165: getting variables 11579 1726882198.46167: in VariableManager get_vars() 11579 1726882198.46419: Calling all_inventory to load vars for managed_node1 11579 1726882198.46423: Calling groups_inventory to load vars for managed_node1 11579 1726882198.46425: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.46438: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.46440: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.46442: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.48482: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.51263: done with get_vars() 11579 1726882198.51295: done getting variables 11579 1726882198.51358: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:18 Friday 20 September 2024 21:29:58 -0400 (0:00:00.076) 0:00:27.222 ****** 11579 1726882198.51398: entering _queue_task() for managed_node1/fail 11579 1726882198.51750: worker is 1 (out of 1 available) 11579 1726882198.51764: exiting _queue_task() for managed_node1/fail 11579 1726882198.51777: done queuing things up, now waiting for results queue to drain 11579 1726882198.51779: waiting for pending results... 
11579 1726882198.52420: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 11579 1726882198.52644: in run() - task 12673a56-9f93-f197-7423-00000000007f 11579 1726882198.52649: variable 'ansible_search_path' from source: unknown 11579 1726882198.52652: variable 'ansible_search_path' from source: unknown 11579 1726882198.52713: calling self._execute() 11579 1726882198.53180: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.53184: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.53187: variable 'omit' from source: magic vars 11579 1726882198.54235: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.54249: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.54429: variable 'network_state' from source: role '' defaults 11579 1726882198.54464: Evaluated conditional (network_state != {}): False 11579 1726882198.54468: when evaluation is False, skipping this task 11579 1726882198.54471: _execute() done 11579 1726882198.54474: dumping result to json 11579 1726882198.54481: done dumping result, returning 11579 1726882198.54485: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying the network state configuration if the system version of the managed host is below 8 [12673a56-9f93-f197-7423-00000000007f] 11579 1726882198.54489: sending task result for task 12673a56-9f93-f197-7423-00000000007f 11579 1726882198.55002: done sending task result for task 12673a56-9f93-f197-7423-00000000007f 11579 1726882198.55005: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882198.55040: no more pending results, returning what we have 11579 
1726882198.55042: results queue empty 11579 1726882198.55043: checking for any_errors_fatal 11579 1726882198.55047: done checking for any_errors_fatal 11579 1726882198.55048: checking for max_fail_percentage 11579 1726882198.55049: done checking for max_fail_percentage 11579 1726882198.55050: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.55051: done checking to see if all hosts have failed 11579 1726882198.55052: getting the remaining hosts for this loop 11579 1726882198.55053: done getting the remaining hosts for this loop 11579 1726882198.55056: getting the next task for host managed_node1 11579 1726882198.55063: done getting next task for host managed_node1 11579 1726882198.55067: ^ task is: TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11579 1726882198.55071: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=8, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882198.55088: getting variables 11579 1726882198.55089: in VariableManager get_vars() 11579 1726882198.55129: Calling all_inventory to load vars for managed_node1 11579 1726882198.55132: Calling groups_inventory to load vars for managed_node1 11579 1726882198.55135: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.55144: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.55146: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.55149: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.56929: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.58672: done with get_vars() 11579 1726882198.58700: done getting variables 11579 1726882198.58768: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:25 Friday 20 September 2024 21:29:58 -0400 (0:00:00.074) 0:00:27.296 ****** 11579 1726882198.58810: entering _queue_task() for managed_node1/fail 11579 1726882198.59298: worker is 1 (out of 1 available) 11579 1726882198.59310: exiting _queue_task() for managed_node1/fail 11579 1726882198.59320: done queuing things up, now waiting for results queue to drain 11579 1726882198.59321: waiting for pending results... 
11579 1726882198.59523: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later 11579 1726882198.59679: in run() - task 12673a56-9f93-f197-7423-000000000080 11579 1726882198.59802: variable 'ansible_search_path' from source: unknown 11579 1726882198.59807: variable 'ansible_search_path' from source: unknown 11579 1726882198.59810: calling self._execute() 11579 1726882198.59859: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.59864: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.59879: variable 'omit' from source: magic vars 11579 1726882198.60306: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.60325: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.60505: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882198.62741: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882198.62822: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882198.62858: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882198.62905: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882198.62930: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882198.63017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.63045: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.63068: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.63199: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.63203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.63230: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.63243: Evaluated conditional (ansible_distribution_major_version | int > 9): True 11579 1726882198.63349: variable 'ansible_distribution' from source: facts 11579 1726882198.63352: variable '__network_rh_distros' from source: role '' defaults 11579 1726882198.63361: Evaluated conditional (ansible_distribution in __network_rh_distros): True 11579 1726882198.63640: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.63674: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.63702: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 
1726882198.63760: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.63763: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.63814: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.63837: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.63860: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.63987: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.63989: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.63992: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.64002: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, 
class_only=False) 11579 1726882198.64028: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.64065: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.64089: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.64436: variable 'network_connections' from source: task vars 11579 1726882198.64447: variable 'port2_profile' from source: play vars 11579 1726882198.64514: variable 'port2_profile' from source: play vars 11579 1726882198.64536: variable 'port1_profile' from source: play vars 11579 1726882198.64588: variable 'port1_profile' from source: play vars 11579 1726882198.64598: variable 'controller_profile' from source: play vars 11579 1726882198.64667: variable 'controller_profile' from source: play vars 11579 1726882198.64698: variable 'network_state' from source: role '' defaults 11579 1726882198.64757: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882198.64950: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882198.65059: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882198.65066: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882198.65069: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882198.65112: Loading TestModule 
'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882198.65133: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882198.65158: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.65196: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882198.65227: Evaluated conditional (network_connections | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0 or network_state.get("interfaces", []) | selectattr("type", "defined") | selectattr("type", "match", "^team$") | list | length > 0): False 11579 1726882198.65230: when evaluation is False, skipping this task 11579 1726882198.65233: _execute() done 11579 1726882198.65235: dumping result to json 11579 1726882198.65237: done dumping result, returning 11579 1726882198.65277: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Abort applying teaming configuration if the system version of the managed host is EL10 or later [12673a56-9f93-f197-7423-000000000080] 11579 1726882198.65281: sending task result for task 12673a56-9f93-f197-7423-000000000080 11579 1726882198.65356: done sending task result for task 12673a56-9f93-f197-7423-000000000080 11579 1726882198.65358: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_connections | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0 
or network_state.get(\"interfaces\", []) | selectattr(\"type\", \"defined\") | selectattr(\"type\", \"match\", \"^team$\") | list | length > 0", "skip_reason": "Conditional result was False" } 11579 1726882198.65444: no more pending results, returning what we have 11579 1726882198.65447: results queue empty 11579 1726882198.65448: checking for any_errors_fatal 11579 1726882198.65455: done checking for any_errors_fatal 11579 1726882198.65456: checking for max_fail_percentage 11579 1726882198.65458: done checking for max_fail_percentage 11579 1726882198.65459: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.65460: done checking to see if all hosts have failed 11579 1726882198.65461: getting the remaining hosts for this loop 11579 1726882198.65462: done getting the remaining hosts for this loop 11579 1726882198.65466: getting the next task for host managed_node1 11579 1726882198.65474: done getting next task for host managed_node1 11579 1726882198.65478: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11579 1726882198.65482: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=9, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11579 1726882198.65710: getting variables 11579 1726882198.65712: in VariableManager get_vars() 11579 1726882198.65751: Calling all_inventory to load vars for managed_node1 11579 1726882198.65755: Calling groups_inventory to load vars for managed_node1 11579 1726882198.65757: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.65766: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.65769: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.65772: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.67158: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.68788: done with get_vars() 11579 1726882198.68814: done getting variables 11579 1726882198.68874: Loading ActionModule 'dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:36 Friday 20 September 2024 21:29:58 -0400 (0:00:00.100) 0:00:27.397 ****** 11579 1726882198.68912: entering _queue_task() for managed_node1/dnf 11579 1726882198.69262: worker is 1 (out of 1 available) 11579 1726882198.69391: exiting _queue_task() for managed_node1/dnf 11579 1726882198.69406: done queuing things up, now waiting for results queue to drain 11579 1726882198.69408: waiting for pending results... 
11579 1726882198.69810: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces 11579 1726882198.69815: in run() - task 12673a56-9f93-f197-7423-000000000081 11579 1726882198.69818: variable 'ansible_search_path' from source: unknown 11579 1726882198.69825: variable 'ansible_search_path' from source: unknown 11579 1726882198.69875: calling self._execute() 11579 1726882198.69980: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.69984: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.69999: variable 'omit' from source: magic vars 11579 1726882198.70421: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.70434: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.70648: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882198.73637: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882198.73763: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882198.73766: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882198.73794: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882198.73823: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882198.73913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.73977: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.73981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.74017: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.74032: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.74299: variable 'ansible_distribution' from source: facts 11579 1726882198.74302: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.74304: Evaluated conditional (ansible_distribution == 'Fedora' or ansible_distribution_major_version | int > 7): True 11579 1726882198.74307: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882198.74417: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.74448: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.74473: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.74575: Loading 
FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.74578: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.74581: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.74799: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.74804: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.74873: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.74877: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.74928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.74951: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 
1726882198.75095: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.75136: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.75150: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.75822: variable 'network_connections' from source: task vars 11579 1726882198.75834: variable 'port2_profile' from source: play vars 11579 1726882198.76001: variable 'port2_profile' from source: play vars 11579 1726882198.76004: variable 'port1_profile' from source: play vars 11579 1726882198.76007: variable 'port1_profile' from source: play vars 11579 1726882198.76009: variable 'controller_profile' from source: play vars 11579 1726882198.76179: variable 'controller_profile' from source: play vars 11579 1726882198.76295: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882198.76740: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882198.76931: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882198.76962: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882198.77068: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882198.77173: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 
(found_in_cache=True, class_only=False) 11579 1726882198.77399: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882198.77404: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.77406: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882198.77520: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882198.78111: variable 'network_connections' from source: task vars 11579 1726882198.78114: variable 'port2_profile' from source: play vars 11579 1726882198.78302: variable 'port2_profile' from source: play vars 11579 1726882198.78306: variable 'port1_profile' from source: play vars 11579 1726882198.78372: variable 'port1_profile' from source: play vars 11579 1726882198.78380: variable 'controller_profile' from source: play vars 11579 1726882198.78702: variable 'controller_profile' from source: play vars 11579 1726882198.78705: Evaluated conditional (__network_wireless_connections_defined or __network_team_connections_defined): False 11579 1726882198.78707: when evaluation is False, skipping this task 11579 1726882198.78709: _execute() done 11579 1726882198.78711: dumping result to json 11579 1726882198.78713: done dumping result, returning 11579 1726882198.78715: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the DNF package manager due to wireless or team interfaces [12673a56-9f93-f197-7423-000000000081] 11579 1726882198.78717: sending task result for task 
12673a56-9f93-f197-7423-000000000081 11579 1726882198.78836: done sending task result for task 12673a56-9f93-f197-7423-000000000081 11579 1726882198.78839: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11579 1726882198.78912: no more pending results, returning what we have 11579 1726882198.78916: results queue empty 11579 1726882198.78917: checking for any_errors_fatal 11579 1726882198.78924: done checking for any_errors_fatal 11579 1726882198.78925: checking for max_fail_percentage 11579 1726882198.78927: done checking for max_fail_percentage 11579 1726882198.78928: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.78929: done checking to see if all hosts have failed 11579 1726882198.78930: getting the remaining hosts for this loop 11579 1726882198.78931: done getting the remaining hosts for this loop 11579 1726882198.78935: getting the next task for host managed_node1 11579 1726882198.78943: done getting next task for host managed_node1 11579 1726882198.78947: ^ task is: TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11579 1726882198.78951: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=10, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882198.79010: getting variables 11579 1726882198.79013: in VariableManager get_vars() 11579 1726882198.79055: Calling all_inventory to load vars for managed_node1 11579 1726882198.79059: Calling groups_inventory to load vars for managed_node1 11579 1726882198.79061: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.79305: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.79310: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.79314: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.81171: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.84238: done with get_vars() 11579 1726882198.84267: done getting variables redirecting (type: action) ansible.builtin.yum to ansible.builtin.dnf 11579 1726882198.84398: Loading ActionModule 'ansible_collections.ansible.builtin.plugins.action.dnf' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/dnf.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:48 Friday 20 September 2024 21:29:58 -0400 (0:00:00.155) 0:00:27.552 ****** 11579 1726882198.84430: entering _queue_task() for managed_node1/yum 11579 1726882198.85018: worker is 1 (out of 1 available) 11579 
1726882198.85148: exiting _queue_task() for managed_node1/yum 11579 1726882198.85161: done queuing things up, now waiting for results queue to drain 11579 1726882198.85162: waiting for pending results... 11579 1726882198.85490: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces 11579 1726882198.85719: in run() - task 12673a56-9f93-f197-7423-000000000082 11579 1726882198.85734: variable 'ansible_search_path' from source: unknown 11579 1726882198.85738: variable 'ansible_search_path' from source: unknown 11579 1726882198.85781: calling self._execute() 11579 1726882198.85997: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882198.86009: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882198.86020: variable 'omit' from source: magic vars 11579 1726882198.86928: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.86932: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882198.87053: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882198.90489: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882198.90575: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882198.90620: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882198.90660: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882198.90762: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882198.90870: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882198.90874: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882198.90876: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882198.90897: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882198.90900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882198.91004: variable 'ansible_distribution_major_version' from source: facts 11579 1726882198.91019: Evaluated conditional (ansible_distribution_major_version | int < 8): False 11579 1726882198.91023: when evaluation is False, skipping this task 11579 1726882198.91026: _execute() done 11579 1726882198.91028: dumping result to json 11579 1726882198.91033: done dumping result, returning 11579 1726882198.91041: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Check if updates for network packages are available through the YUM package manager due to wireless or team interfaces [12673a56-9f93-f197-7423-000000000082] 11579 1726882198.91044: sending task result for task 12673a56-9f93-f197-7423-000000000082 11579 1726882198.91272: done sending task result for task 12673a56-9f93-f197-7423-000000000082 11579 1726882198.91276: WORKER PROCESS EXITING skipping: 
[managed_node1] => { "changed": false, "false_condition": "ansible_distribution_major_version | int < 8", "skip_reason": "Conditional result was False" } 11579 1726882198.91337: no more pending results, returning what we have 11579 1726882198.91340: results queue empty 11579 1726882198.91341: checking for any_errors_fatal 11579 1726882198.91346: done checking for any_errors_fatal 11579 1726882198.91347: checking for max_fail_percentage 11579 1726882198.91348: done checking for max_fail_percentage 11579 1726882198.91349: checking to see if all hosts have failed and the running result is not ok 11579 1726882198.91350: done checking to see if all hosts have failed 11579 1726882198.91351: getting the remaining hosts for this loop 11579 1726882198.91353: done getting the remaining hosts for this loop 11579 1726882198.91356: getting the next task for host managed_node1 11579 1726882198.91364: done getting next task for host managed_node1 11579 1726882198.91369: ^ task is: TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11579 1726882198.91373: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=11, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882198.91399: getting variables 11579 1726882198.91401: in VariableManager get_vars() 11579 1726882198.91449: Calling all_inventory to load vars for managed_node1 11579 1726882198.91452: Calling groups_inventory to load vars for managed_node1 11579 1726882198.91455: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882198.91466: Calling all_plugins_play to load vars for managed_node1 11579 1726882198.91469: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882198.91472: Calling groups_plugins_play to load vars for managed_node1 11579 1726882198.95263: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882198.99153: done with get_vars() 11579 1726882198.99180: done getting variables 11579 1726882198.99288: Loading ActionModule 'fail' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/fail.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:60 Friday 20 September 2024 21:29:58 -0400 (0:00:00.148) 0:00:27.701 ****** 11579 1726882198.99327: entering _queue_task() for managed_node1/fail 11579 1726882199.00221: worker is 1 (out of 1 available) 11579 1726882199.00236: exiting _queue_task() for managed_node1/fail 11579 1726882199.00250: done queuing things up, now waiting for results queue to drain 11579 1726882199.00252: waiting for pending results... 
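Annotation: the DNF-check task ran its conditionals (skipped because no wireless or team interfaces are defined), while the YUM variant was skipped on `ansible_distribution_major_version | int < 8` evaluating False. A sketch of that version gate, assuming the role simply routes to YUM on EL7 and earlier and DNF on EL8 and later (the helper name is made up for illustration):

```python
def package_manager_for(major_version):
    """Illustrative version gate: the YUM task is conditioned on
    ansible_distribution_major_version | int < 8, the DNF task on
    the complementary range."""
    return "yum" if int(major_version) < 8 else "dnf"

# This run's host reports a major version > 9, so the YUM task skips
# and only the DNF task's remaining conditionals are evaluated.
print(package_manager_for("10"))  # dnf
print(package_manager_for("7"))   # yum
```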
11579 1726882199.00926: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces 11579 1726882199.01125: in run() - task 12673a56-9f93-f197-7423-000000000083 11579 1726882199.01149: variable 'ansible_search_path' from source: unknown 11579 1726882199.01158: variable 'ansible_search_path' from source: unknown 11579 1726882199.01204: calling self._execute() 11579 1726882199.01435: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.01555: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.01558: variable 'omit' from source: magic vars 11579 1726882199.02328: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.02347: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.02672: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.03084: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882199.07447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882199.07633: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882199.07672: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882199.07709: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882199.07852: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882199.07934: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 
(found_in_cache=True, class_only=False) 11579 1726882199.08023: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.08047: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.08208: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.08223: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.08271: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.08413: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.08437: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.08476: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.08607: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.08649: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.08672: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.08699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.08848: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.08862: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.09208: variable 'network_connections' from source: task vars 11579 1726882199.09223: variable 'port2_profile' from source: play vars 11579 1726882199.09412: variable 'port2_profile' from source: play vars 11579 1726882199.09469: variable 'port1_profile' from source: play vars 11579 1726882199.09590: variable 'port1_profile' from source: play vars 11579 1726882199.09601: variable 'controller_profile' from source: play vars 11579 1726882199.09656: variable 'controller_profile' from source: play vars 11579 1726882199.09777: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882199.10400: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882199.10404: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882199.10406: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882199.10409: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882199.10411: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882199.10414: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882199.10416: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.10418: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882199.10421: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882199.10653: variable 'network_connections' from source: task vars 11579 1726882199.10656: variable 'port2_profile' from source: play vars 11579 1726882199.10724: variable 'port2_profile' from source: play vars 11579 1726882199.10732: variable 'port1_profile' from source: play vars 11579 1726882199.10815: variable 'port1_profile' from source: play vars 11579 1726882199.10824: variable 'controller_profile' from source: play vars 11579 1726882199.10888: variable 'controller_profile' from source: play vars 11579 1726882199.11042: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11579 1726882199.11054: when evaluation is False, skipping this task 11579 1726882199.11056: _execute() done 11579 1726882199.11059: dumping result to json 11579 1726882199.11061: done dumping result, returning 11579 1726882199.11063: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ask user's consent to restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f197-7423-000000000083] 11579 1726882199.11065: sending task result for task 12673a56-9f93-f197-7423-000000000083 11579 1726882199.11168: done sending task result for task 12673a56-9f93-f197-7423-000000000083 11579 1726882199.11171: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11579 1726882199.11345: no more pending results, returning what we have 11579 1726882199.11349: results queue empty 11579 1726882199.11349: checking for any_errors_fatal 11579 1726882199.11356: done checking for any_errors_fatal 11579 1726882199.11356: checking for max_fail_percentage 11579 1726882199.11358: done checking for max_fail_percentage 11579 1726882199.11359: checking to see if all hosts have failed and the running result is not ok 11579 1726882199.11360: done checking to see if all hosts have failed 11579 1726882199.11361: getting the remaining hosts for this loop 11579 1726882199.11362: done getting the remaining hosts for this loop 11579 1726882199.11366: getting the next task for host managed_node1 11579 1726882199.11374: done getting next task for host managed_node1 11579 1726882199.11378: ^ task is: TASK: fedora.linux_system_roles.network : Install packages 11579 1726882199.11382: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=12, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882199.11404: getting variables 11579 1726882199.11406: in VariableManager get_vars() 11579 1726882199.11448: Calling all_inventory to load vars for managed_node1 11579 1726882199.11451: Calling groups_inventory to load vars for managed_node1 11579 1726882199.11454: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882199.11466: Calling all_plugins_play to load vars for managed_node1 11579 1726882199.11469: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882199.11472: Calling groups_plugins_play to load vars for managed_node1 11579 1726882199.13762: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882199.15754: done with get_vars() 11579 1726882199.15780: done getting variables 11579 1726882199.15837: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install packages] ******************** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:73 Friday 20 September 2024 21:29:59 -0400 (0:00:00.165) 0:00:27.867 ****** 11579 1726882199.15871: entering _queue_task() for managed_node1/package 11579 1726882199.17152: worker is 1 (out of 1 available) 11579 1726882199.17164: exiting _queue_task() for managed_node1/package 11579 1726882199.17368: done queuing things up, now waiting for results queue to drain 11579 1726882199.17370: waiting for pending results... 11579 1726882199.17869: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages 11579 1726882199.17945: in run() - task 12673a56-9f93-f197-7423-000000000084 11579 1726882199.17958: variable 'ansible_search_path' from source: unknown 11579 1726882199.17962: variable 'ansible_search_path' from source: unknown 11579 1726882199.18100: calling self._execute() 11579 1726882199.18104: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.18107: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.18110: variable 'omit' from source: magic vars 11579 1726882199.18478: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.18501: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.18699: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882199.18983: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882199.19035: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882199.19069: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882199.19150: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 
1726882199.19264: variable 'network_packages' from source: role '' defaults 11579 1726882199.19380: variable '__network_provider_setup' from source: role '' defaults 11579 1726882199.19391: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882199.19453: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882199.19469: variable '__network_packages_default_nm' from source: role '' defaults 11579 1726882199.19574: variable '__network_packages_default_nm' from source: role '' defaults 11579 1726882199.19907: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882199.22471: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882199.22541: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882199.22573: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882199.22604: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882199.22635: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882199.22805: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.22809: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.22812: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 
(found_in_cache=True, class_only=False) 11579 1726882199.22815: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.22817: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.22866: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.22889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.23000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.23004: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.23006: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.23260: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11579 1726882199.23440: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.23463: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.23651: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.23654: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.23676: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.23949: variable 'ansible_python' from source: facts 11579 1726882199.23973: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11579 1726882199.24135: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882199.24214: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11579 1726882199.24558: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.24668: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.24701: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.24738: Loading FilterModule 
'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.24752: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.24913: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.25200: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.25203: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.25205: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.25207: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.25380: variable 'network_connections' from source: task vars 11579 1726882199.25384: variable 'port2_profile' from source: play vars 11579 1726882199.25644: variable 'port2_profile' from source: play vars 11579 1726882199.25647: variable 'port1_profile' from source: play vars 11579 1726882199.25828: variable 'port1_profile' from source: play vars 11579 1726882199.26099: variable 'controller_profile' from source: play vars 11579 1726882199.26142: 
variable 'controller_profile' from source: play vars 11579 1726882199.26304: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882199.26336: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882199.26363: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.26514: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882199.26567: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.27282: variable 'network_connections' from source: task vars 11579 1726882199.27288: variable 'port2_profile' from source: play vars 11579 1726882199.27459: variable 'port2_profile' from source: play vars 11579 1726882199.27472: variable 'port1_profile' from source: play vars 11579 1726882199.27591: variable 'port1_profile' from source: play vars 11579 1726882199.27623: variable 'controller_profile' from source: play vars 11579 1726882199.27741: variable 'controller_profile' from source: play vars 11579 1726882199.27774: variable '__network_packages_default_wireless' from source: role '' defaults 11579 1726882199.27861: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.28210: variable 'network_connections' from source: task vars 11579 1726882199.28213: variable 'port2_profile' from source: play vars 11579 1726882199.28281: variable 'port2_profile' from source: play vars 11579 1726882199.28290: variable 
'port1_profile' from source: play vars 11579 1726882199.28355: variable 'port1_profile' from source: play vars 11579 1726882199.28361: variable 'controller_profile' from source: play vars 11579 1726882199.28432: variable 'controller_profile' from source: play vars 11579 1726882199.28456: variable '__network_packages_default_team' from source: role '' defaults 11579 1726882199.28542: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882199.28860: variable 'network_connections' from source: task vars 11579 1726882199.28863: variable 'port2_profile' from source: play vars 11579 1726882199.28934: variable 'port2_profile' from source: play vars 11579 1726882199.28941: variable 'port1_profile' from source: play vars 11579 1726882199.29003: variable 'port1_profile' from source: play vars 11579 1726882199.29010: variable 'controller_profile' from source: play vars 11579 1726882199.29077: variable 'controller_profile' from source: play vars 11579 1726882199.29131: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882199.29400: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882199.29403: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882199.29406: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882199.29516: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11579 1726882199.30117: variable 'network_connections' from source: task vars 11579 1726882199.30120: variable 'port2_profile' from source: play vars 11579 1726882199.30291: variable 'port2_profile' from source: play vars 11579 1726882199.30299: variable 'port1_profile' from source: play vars 11579 1726882199.30345: variable 'port1_profile' from source: play vars 11579 1726882199.30353: variable 'controller_profile' from source: play vars 11579 1726882199.30418: variable 
'controller_profile' from source: play vars 11579 1726882199.30427: variable 'ansible_distribution' from source: facts 11579 1726882199.30430: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.30436: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.30452: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11579 1726882199.30720: variable 'ansible_distribution' from source: facts 11579 1726882199.30723: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.30726: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.30728: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11579 1726882199.30792: variable 'ansible_distribution' from source: facts 11579 1726882199.30805: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.30811: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.30846: variable 'network_provider' from source: set_fact 11579 1726882199.30860: variable 'ansible_facts' from source: unknown 11579 1726882199.31531: Evaluated conditional (not network_packages is subset(ansible_facts.packages.keys())): False 11579 1726882199.31535: when evaluation is False, skipping this task 11579 1726882199.31537: _execute() done 11579 1726882199.31539: dumping result to json 11579 1726882199.31599: done dumping result, returning 11579 1726882199.31602: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install packages [12673a56-9f93-f197-7423-000000000084] 11579 1726882199.31604: sending task result for task 12673a56-9f93-f197-7423-000000000084 11579 1726882199.31708: done sending task result for task 12673a56-9f93-f197-7423-000000000084 11579 1726882199.31711: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "not network_packages is 
subset(ansible_facts.packages.keys())", "skip_reason": "Conditional result was False" } 11579 1726882199.31766: no more pending results, returning what we have 11579 1726882199.31770: results queue empty 11579 1726882199.31771: checking for any_errors_fatal 11579 1726882199.31779: done checking for any_errors_fatal 11579 1726882199.31780: checking for max_fail_percentage 11579 1726882199.31783: done checking for max_fail_percentage 11579 1726882199.31783: checking to see if all hosts have failed and the running result is not ok 11579 1726882199.31785: done checking to see if all hosts have failed 11579 1726882199.31785: getting the remaining hosts for this loop 11579 1726882199.31787: done getting the remaining hosts for this loop 11579 1726882199.31901: getting the next task for host managed_node1 11579 1726882199.31911: done getting next task for host managed_node1 11579 1726882199.31915: ^ task is: TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11579 1726882199.31921: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=13, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882199.31939: getting variables 11579 1726882199.31941: in VariableManager get_vars() 11579 1726882199.31984: Calling all_inventory to load vars for managed_node1 11579 1726882199.31987: Calling groups_inventory to load vars for managed_node1 11579 1726882199.31990: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882199.32106: Calling all_plugins_play to load vars for managed_node1 11579 1726882199.32114: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882199.32119: Calling groups_plugins_play to load vars for managed_node1 11579 1726882199.33823: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882199.35921: done with get_vars() 11579 1726882199.35949: done getting variables 11579 1726882199.36218: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:85 Friday 20 September 2024 21:29:59 -0400 (0:00:00.203) 0:00:28.070 ****** 11579 1726882199.36257: entering _queue_task() for managed_node1/package 11579 1726882199.36828: worker is 1 (out of 1 available) 11579 1726882199.36904: exiting _queue_task() for managed_node1/package 11579 1726882199.36918: done queuing things up, now waiting for results queue to drain 11579 1726882199.36920: waiting for pending results... 
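The "Install packages" task in the trace above is skipped because its `when:` guard, `not network_packages is subset(ansible_facts.packages.keys())`, evaluated to False. Ansible's builtin `subset` test is implemented as Python set inclusion (`set(a) <= set(b)`), so the guard is False exactly when every required package already appears in the gathered package facts. A minimal sketch with hypothetical package data (the package names and versions here are illustrative, not taken from this run):

```python
# Hypothetical inputs standing in for the role variable and gathered facts.
network_packages = ["NetworkManager"]
ansible_facts_packages = {
    "NetworkManager": [{"version": "1.48.0"}],  # already installed
    "openssh": [{"version": "9.6"}],
}

# Ansible's `a is subset(b)` test is equivalent to set(a) <= set(b).
already_installed = set(network_packages) <= set(ansible_facts_packages.keys())

# The task's `when:` condition negates the subset test: install only if
# something required is missing. Here everything is present, so the
# condition is False and the task is skipped, matching the log's
# "Conditional result was False".
needs_install = not already_installed
print(needs_install)  # False
```

This is why the log reports `"false_condition": "not network_packages is subset(ansible_facts.packages.keys())"` rather than running the `package` action module.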
11579 1726882199.37304: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable 11579 1726882199.37353: in run() - task 12673a56-9f93-f197-7423-000000000085 11579 1726882199.37371: variable 'ansible_search_path' from source: unknown 11579 1726882199.37378: variable 'ansible_search_path' from source: unknown 11579 1726882199.37431: calling self._execute() 11579 1726882199.37520: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.37600: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.37604: variable 'omit' from source: magic vars 11579 1726882199.37915: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.37935: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.38066: variable 'network_state' from source: role '' defaults 11579 1726882199.38086: Evaluated conditional (network_state != {}): False 11579 1726882199.38098: when evaluation is False, skipping this task 11579 1726882199.38107: _execute() done 11579 1726882199.38115: dumping result to json 11579 1726882199.38123: done dumping result, returning 11579 1726882199.38136: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install NetworkManager and nmstate when using network_state variable [12673a56-9f93-f197-7423-000000000085] 11579 1726882199.38147: sending task result for task 12673a56-9f93-f197-7423-000000000085 11579 1726882199.38262: done sending task result for task 12673a56-9f93-f197-7423-000000000085 skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882199.38312: no more pending results, returning what we have 11579 1726882199.38316: results queue empty 11579 1726882199.38317: checking for any_errors_fatal 11579 1726882199.38323: 
done checking for any_errors_fatal 11579 1726882199.38324: checking for max_fail_percentage 11579 1726882199.38325: done checking for max_fail_percentage 11579 1726882199.38326: checking to see if all hosts have failed and the running result is not ok 11579 1726882199.38327: done checking to see if all hosts have failed 11579 1726882199.38328: getting the remaining hosts for this loop 11579 1726882199.38329: done getting the remaining hosts for this loop 11579 1726882199.38332: getting the next task for host managed_node1 11579 1726882199.38340: done getting next task for host managed_node1 11579 1726882199.38343: ^ task is: TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11579 1726882199.38347: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=14, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882199.38370: getting variables 11579 1726882199.38372: in VariableManager get_vars() 11579 1726882199.38414: Calling all_inventory to load vars for managed_node1 11579 1726882199.38417: Calling groups_inventory to load vars for managed_node1 11579 1726882199.38419: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882199.38432: Calling all_plugins_play to load vars for managed_node1 11579 1726882199.38435: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882199.38438: Calling groups_plugins_play to load vars for managed_node1 11579 1726882199.38956: WORKER PROCESS EXITING 11579 1726882199.40814: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882199.43624: done with get_vars() 11579 1726882199.43656: done getting variables 11579 1726882199.43839: Loading ActionModule 'package' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/package.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:96 Friday 20 September 2024 21:29:59 -0400 (0:00:00.076) 0:00:28.147 ****** 11579 1726882199.43877: entering _queue_task() for managed_node1/package 11579 1726882199.44620: worker is 1 (out of 1 available) 11579 1726882199.44634: exiting _queue_task() for managed_node1/package 11579 1726882199.44645: done queuing things up, now waiting for results queue to drain 11579 1726882199.44647: waiting for pending results... 
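Both nmstate-related tasks above ("Install NetworkManager and nmstate..." and "Install python3-libnmstate...") are skipped for the same reason: the guard `network_state != {}` is False because `network_state` still holds its role default of an empty dict, meaning no declarative network state was requested. A minimal sketch of that evaluation (variable values are the role defaults as reported by the log, the rest is illustrative):

```python
# Role default: the play never set network_state, so it is an empty mapping.
network_state = {}

# The `when: network_state != {}` guard reduces to a simple inequality
# on the dict; with the default it is False and the task is skipped.
run_nmstate_tasks = network_state != {}
print(run_nmstate_tasks)  # False

# Had the play supplied declarative state, the guard would flip to True:
network_state = {"interfaces": [{"name": "eth0", "state": "up"}]}
print(network_state != {})  # True
```

This matches the repeated `"false_condition": "network_state != {}"` entries in the skipping results.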
11579 1726882199.45286: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable 11579 1726882199.45711: in run() - task 12673a56-9f93-f197-7423-000000000086 11579 1726882199.45735: variable 'ansible_search_path' from source: unknown 11579 1726882199.45744: variable 'ansible_search_path' from source: unknown 11579 1726882199.46003: calling self._execute() 11579 1726882199.46115: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.46128: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.46143: variable 'omit' from source: magic vars 11579 1726882199.47119: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.47137: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.47414: variable 'network_state' from source: role '' defaults 11579 1726882199.47430: Evaluated conditional (network_state != {}): False 11579 1726882199.47438: when evaluation is False, skipping this task 11579 1726882199.47445: _execute() done 11579 1726882199.47454: dumping result to json 11579 1726882199.47461: done dumping result, returning 11579 1726882199.47473: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Install python3-libnmstate when using network_state variable [12673a56-9f93-f197-7423-000000000086] 11579 1726882199.47899: sending task result for task 12673a56-9f93-f197-7423-000000000086 11579 1726882199.47980: done sending task result for task 12673a56-9f93-f197-7423-000000000086 11579 1726882199.47983: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882199.48035: no more pending results, returning what we have 11579 1726882199.48038: results queue empty 11579 1726882199.48039: checking for 
any_errors_fatal 11579 1726882199.48043: done checking for any_errors_fatal 11579 1726882199.48044: checking for max_fail_percentage 11579 1726882199.48045: done checking for max_fail_percentage 11579 1726882199.48046: checking to see if all hosts have failed and the running result is not ok 11579 1726882199.48047: done checking to see if all hosts have failed 11579 1726882199.48047: getting the remaining hosts for this loop 11579 1726882199.48049: done getting the remaining hosts for this loop 11579 1726882199.48051: getting the next task for host managed_node1 11579 1726882199.48057: done getting next task for host managed_node1 11579 1726882199.48060: ^ task is: TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11579 1726882199.48064: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=15, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882199.48080: getting variables 11579 1726882199.48081: in VariableManager get_vars() 11579 1726882199.48118: Calling all_inventory to load vars for managed_node1 11579 1726882199.48120: Calling groups_inventory to load vars for managed_node1 11579 1726882199.48122: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882199.48131: Calling all_plugins_play to load vars for managed_node1 11579 1726882199.48133: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882199.48135: Calling groups_plugins_play to load vars for managed_node1 11579 1726882199.49824: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882199.51362: done with get_vars() 11579 1726882199.51386: done getting variables 11579 1726882199.51450: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:109 Friday 20 September 2024 21:29:59 -0400 (0:00:00.076) 0:00:28.223 ****** 11579 1726882199.51486: entering _queue_task() for managed_node1/service 11579 1726882199.51927: worker is 1 (out of 1 available) 11579 1726882199.51937: exiting _queue_task() for managed_node1/service 11579 1726882199.51946: done queuing things up, now waiting for results queue to drain 11579 1726882199.51947: waiting for pending results... 
11579 1726882199.52136: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces 11579 1726882199.52310: in run() - task 12673a56-9f93-f197-7423-000000000087 11579 1726882199.52330: variable 'ansible_search_path' from source: unknown 11579 1726882199.52337: variable 'ansible_search_path' from source: unknown 11579 1726882199.52376: calling self._execute() 11579 1726882199.52480: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.52497: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.52516: variable 'omit' from source: magic vars 11579 1726882199.52883: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.52939: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.53020: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.53227: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882199.55447: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882199.55528: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882199.55617: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882199.55621: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882199.55637: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882199.55718: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, 
class_only=False) 11579 1726882199.55753: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.55779: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.55825: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.55845: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.55892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.55942: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.55946: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.55981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.56000: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.56097: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.56105: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.56115: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.56158: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.56177: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.56368: variable 'network_connections' from source: task vars 11579 1726882199.56387: variable 'port2_profile' from source: play vars 11579 1726882199.56464: variable 'port2_profile' from source: play vars 11579 1726882199.56481: variable 'port1_profile' from source: play vars 11579 1726882199.56551: variable 'port1_profile' from source: play vars 11579 1726882199.56649: variable 'controller_profile' from source: play vars 11579 1726882199.56652: variable 'controller_profile' from source: play vars 11579 1726882199.56709: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882199.56902: Loading TestModule 'core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882199.56943: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882199.56981: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882199.57019: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882199.57063: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882199.57097: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882199.57129: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.57157: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882199.57223: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882199.57451: variable 'network_connections' from source: task vars 11579 1726882199.57462: variable 'port2_profile' from source: play vars 11579 1726882199.57530: variable 'port2_profile' from source: play vars 11579 1726882199.57626: variable 'port1_profile' from source: play vars 11579 1726882199.57630: variable 'port1_profile' from source: play vars 11579 1726882199.57632: variable 'controller_profile' from source: play vars 11579 1726882199.57672: variable 'controller_profile' from source: play vars 11579 1726882199.57705: Evaluated conditional 
(__network_wireless_connections_defined or __network_team_connections_defined): False 11579 1726882199.57723: when evaluation is False, skipping this task 11579 1726882199.57735: _execute() done 11579 1726882199.57743: dumping result to json 11579 1726882199.57749: done dumping result, returning 11579 1726882199.57759: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Restart NetworkManager due to wireless or team interfaces [12673a56-9f93-f197-7423-000000000087] 11579 1726882199.57767: sending task result for task 12673a56-9f93-f197-7423-000000000087 11579 1726882199.58031: done sending task result for task 12673a56-9f93-f197-7423-000000000087 11579 1726882199.58034: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wireless_connections_defined or __network_team_connections_defined", "skip_reason": "Conditional result was False" } 11579 1726882199.58082: no more pending results, returning what we have 11579 1726882199.58085: results queue empty 11579 1726882199.58086: checking for any_errors_fatal 11579 1726882199.58097: done checking for any_errors_fatal 11579 1726882199.58098: checking for max_fail_percentage 11579 1726882199.58101: done checking for max_fail_percentage 11579 1726882199.58101: checking to see if all hosts have failed and the running result is not ok 11579 1726882199.58102: done checking to see if all hosts have failed 11579 1726882199.58103: getting the remaining hosts for this loop 11579 1726882199.58105: done getting the remaining hosts for this loop 11579 1726882199.58108: getting the next task for host managed_node1 11579 1726882199.58117: done getting next task for host managed_node1 11579 1726882199.58122: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11579 1726882199.58126: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, 
update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=16, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882199.58148: getting variables 11579 1726882199.58150: in VariableManager get_vars() 11579 1726882199.58383: Calling all_inventory to load vars for managed_node1 11579 1726882199.58387: Calling groups_inventory to load vars for managed_node1 11579 1726882199.58390: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882199.58403: Calling all_plugins_play to load vars for managed_node1 11579 1726882199.58406: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882199.58409: Calling groups_plugins_play to load vars for managed_node1 11579 1726882199.59751: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882199.61421: done with get_vars() 11579 1726882199.61444: done getting variables 11579 1726882199.61509: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start NetworkManager] ***** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122 Friday 20 September 2024 21:29:59 -0400 (0:00:00.100) 0:00:28.323 ****** 11579 1726882199.61543: entering _queue_task() for managed_node1/service 11579 1726882199.61883: worker is 1 (out of 1 available) 11579 1726882199.62003: exiting _queue_task() for managed_node1/service 11579 1726882199.62014: done queuing things up, now waiting for results queue to drain 11579 1726882199.62016: waiting for pending results... 11579 1726882199.62204: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager 11579 1726882199.62353: in run() - task 12673a56-9f93-f197-7423-000000000088 11579 1726882199.62375: variable 'ansible_search_path' from source: unknown 11579 1726882199.62383: variable 'ansible_search_path' from source: unknown 11579 1726882199.62428: calling self._execute() 11579 1726882199.62530: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.62540: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.62556: variable 'omit' from source: magic vars 11579 1726882199.63100: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.63103: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882199.63114: variable 'network_provider' from source: set_fact 11579 1726882199.63123: variable 'network_state' from source: role '' defaults 11579 1726882199.63134: Evaluated conditional (network_provider == "nm" or network_state != {}): True 11579 1726882199.63142: variable 'omit' from source: magic vars 11579 1726882199.63203: variable 'omit' from source: magic vars 11579 1726882199.63242: variable 'network_service_name' from source: role '' defaults 11579 1726882199.63312: variable 'network_service_name' from source: role '' defaults 11579 1726882199.63431: variable '__network_provider_setup' from 
source: role '' defaults 11579 1726882199.63444: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882199.63511: variable '__network_service_name_default_nm' from source: role '' defaults 11579 1726882199.63525: variable '__network_packages_default_nm' from source: role '' defaults 11579 1726882199.63590: variable '__network_packages_default_nm' from source: role '' defaults 11579 1726882199.63870: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882199.66052: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882199.66139: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882199.66183: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882199.66225: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882199.66251: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882199.66336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.66371: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.66402: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.66444: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.66479: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.66517: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.66541: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.66565: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.66699: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.66703: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.66855: variable '__network_packages_default_gobject_packages' from source: role '' defaults 11579 1726882199.66981: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.67014: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 
(found_in_cache=True, class_only=False) 11579 1726882199.67048: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.67090: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.67116: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.67213: variable 'ansible_python' from source: facts 11579 1726882199.67245: variable '__network_packages_default_wpa_supplicant' from source: role '' defaults 11579 1726882199.67331: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882199.67417: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11579 1726882199.67550: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.67583: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.67618: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.67660: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 
1726882199.67700: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.67741: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882199.67889: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882199.67892: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.67900: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882199.67902: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882199.68009: variable 'network_connections' from source: task vars 11579 1726882199.68027: variable 'port2_profile' from source: play vars 11579 1726882199.68126: variable 'port2_profile' from source: play vars 11579 1726882199.68129: variable 'port1_profile' from source: play vars 11579 1726882199.68202: variable 'port1_profile' from source: play vars 11579 1726882199.68217: variable 'controller_profile' from source: play vars 11579 1726882199.68283: variable 'controller_profile' from source: play vars 11579 1726882199.68453: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 
1726882199.68608: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882199.68657: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882199.68717: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882199.68761: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882199.68834: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882199.68869: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882199.68917: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882199.68955: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882199.69017: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.69284: variable 'network_connections' from source: task vars 11579 1726882199.69326: variable 'port2_profile' from source: play vars 11579 1726882199.69384: variable 'port2_profile' from source: play vars 11579 1726882199.69409: variable 'port1_profile' from source: play vars 11579 1726882199.69490: variable 'port1_profile' from source: play vars 11579 1726882199.69544: variable 'controller_profile' from source: play vars 11579 1726882199.69597: variable 'controller_profile' from source: play vars 11579 
1726882199.69637: variable '__network_packages_default_wireless' from source: role '' defaults 11579 1726882199.69727: variable '__network_wireless_connections_defined' from source: role '' defaults 11579 1726882199.70087: variable 'network_connections' from source: task vars 11579 1726882199.70090: variable 'port2_profile' from source: play vars 11579 1726882199.70135: variable 'port2_profile' from source: play vars 11579 1726882199.70148: variable 'port1_profile' from source: play vars 11579 1726882199.70226: variable 'port1_profile' from source: play vars 11579 1726882199.70239: variable 'controller_profile' from source: play vars 11579 1726882199.70319: variable 'controller_profile' from source: play vars 11579 1726882199.70348: variable '__network_packages_default_team' from source: role '' defaults 11579 1726882199.70600: variable '__network_team_connections_defined' from source: role '' defaults 11579 1726882199.70740: variable 'network_connections' from source: task vars 11579 1726882199.70750: variable 'port2_profile' from source: play vars 11579 1726882199.70829: variable 'port2_profile' from source: play vars 11579 1726882199.70842: variable 'port1_profile' from source: play vars 11579 1726882199.70919: variable 'port1_profile' from source: play vars 11579 1726882199.70937: variable 'controller_profile' from source: play vars 11579 1726882199.71011: variable 'controller_profile' from source: play vars 11579 1726882199.71073: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882199.71144: variable '__network_service_name_default_initscripts' from source: role '' defaults 11579 1726882199.71200: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882199.71229: variable '__network_packages_default_initscripts' from source: role '' defaults 11579 1726882199.71458: variable '__network_packages_default_initscripts_bridge' from source: role '' defaults 11579 1726882199.71953: 
variable 'network_connections' from source: task vars 11579 1726882199.71962: variable 'port2_profile' from source: play vars 11579 1726882199.72028: variable 'port2_profile' from source: play vars 11579 1726882199.72039: variable 'port1_profile' from source: play vars 11579 1726882199.72123: variable 'port1_profile' from source: play vars 11579 1726882199.72129: variable 'controller_profile' from source: play vars 11579 1726882199.72170: variable 'controller_profile' from source: play vars 11579 1726882199.72180: variable 'ansible_distribution' from source: facts 11579 1726882199.72187: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.72204: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.72222: variable '__network_packages_default_initscripts_network_scripts' from source: role '' defaults 11579 1726882199.72457: variable 'ansible_distribution' from source: facts 11579 1726882199.72460: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.72462: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.72464: variable '__network_packages_default_initscripts_dhcp_client' from source: role '' defaults 11579 1726882199.72614: variable 'ansible_distribution' from source: facts 11579 1726882199.72623: variable '__network_rh_distros' from source: role '' defaults 11579 1726882199.72633: variable 'ansible_distribution_major_version' from source: facts 11579 1726882199.72676: variable 'network_provider' from source: set_fact 11579 1726882199.72708: variable 'omit' from source: magic vars 11579 1726882199.72740: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882199.72773: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882199.72805: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 
1726882199.72891: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882199.72898: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882199.72901: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882199.72904: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.72906: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.73003: Set connection var ansible_timeout to 10 11579 1726882199.73016: Set connection var ansible_shell_type to sh 11579 1726882199.73028: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882199.73038: Set connection var ansible_shell_executable to /bin/sh 11579 1726882199.73050: Set connection var ansible_pipelining to False 11579 1726882199.73057: Set connection var ansible_connection to ssh 11579 1726882199.73083: variable 'ansible_shell_executable' from source: unknown 11579 1726882199.73091: variable 'ansible_connection' from source: unknown 11579 1726882199.73197: variable 'ansible_module_compression' from source: unknown 11579 1726882199.73202: variable 'ansible_shell_type' from source: unknown 11579 1726882199.73204: variable 'ansible_shell_executable' from source: unknown 11579 1726882199.73206: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882199.73208: variable 'ansible_pipelining' from source: unknown 11579 1726882199.73210: variable 'ansible_timeout' from source: unknown 11579 1726882199.73212: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882199.73256: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882199.73273: variable 'omit' from source: magic vars 11579 1726882199.73283: starting attempt loop 11579 1726882199.73291: running the handler 11579 1726882199.73376: variable 'ansible_facts' from source: unknown 11579 1726882199.74199: _low_level_execute_command(): starting 11579 1726882199.74202: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882199.74915: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882199.74965: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882199.74985: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882199.75015: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882199.75101: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882199.76791: stdout chunk 
(state=3): >>>/root <<< 11579 1726882199.76951: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882199.76955: stdout chunk (state=3): >>><<< 11579 1726882199.76957: stderr chunk (state=3): >>><<< 11579 1726882199.76977: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882199.77075: _low_level_execute_command(): starting 11579 1726882199.77079: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800 `" && echo ansible-tmp-1726882199.769842-12944-161261447700800="` echo /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800 `" ) && sleep 0' 11579 1726882199.77663: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 
3.2.2 4 Jun 2024 <<< 11579 1726882199.77677: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882199.77696: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882199.77714: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882199.77751: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882199.77765: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882199.77856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882199.77901: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882199.77937: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882199.79808: stdout chunk (state=3): >>>ansible-tmp-1726882199.769842-12944-161261447700800=/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800 <<< 11579 1726882199.79955: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882199.79982: stdout chunk (state=3): >>><<< 11579 1726882199.79986: stderr chunk (state=3): >>><<< 11579 1726882199.80201: _low_level_execute_command() done: rc=0, 
stdout=ansible-tmp-1726882199.769842-12944-161261447700800=/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882199.80209: variable 'ansible_module_compression' from source: unknown 11579 1726882199.80211: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.systemd-ZIP_DEFLATED 11579 1726882199.80213: variable 'ansible_facts' from source: unknown 11579 1726882199.80396: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py 11579 1726882199.80540: Sending initial data 11579 1726882199.80643: Sent initial data (155 bytes) 11579 1726882199.81189: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882199.81211: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882199.81299: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882199.81364: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882199.81367: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882199.81397: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882199.81456: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882199.82970: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882199.82980: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server 
supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882199.83047: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882199.83173: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp650sklh4 /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py <<< 11579 1726882199.83176: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py" <<< 11579 1726882199.83246: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp650sklh4" to remote "/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py" <<< 11579 1726882199.84766: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882199.84918: stderr chunk (state=3): >>><<< 11579 1726882199.84922: stdout chunk (state=3): >>><<< 11579 1726882199.84925: done transferring module to remote 11579 1726882199.84927: _low_level_execute_command(): starting 11579 1726882199.84930: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/ /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py && sleep 0' 11579 1726882199.85484: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882199.85552: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882199.85606: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882199.85619: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882199.85663: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882199.85706: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882199.87529: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882199.87551: stdout chunk (state=3): >>><<< 11579 1726882199.87554: stderr chunk (state=3): >>><<< 11579 1726882199.87646: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882199.87650: _low_level_execute_command(): starting 11579 1726882199.87653: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/AnsiballZ_systemd.py && sleep 0' 11579 1726882199.88201: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882199.88217: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882199.88232: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882199.88256: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882199.88278: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882199.88362: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: 
checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882199.88395: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882199.88411: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882199.88436: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882199.88519: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.17356: stdout chunk (state=3): >>> {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": 
"0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300200448", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "499621000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": 
"yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": "auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpRe<<< 11579 1726882200.17605: stdout chunk (state=3): >>>ceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", 
"CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": 
"root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", "Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": 
"yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", "Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} <<< 11579 1726882200.19273: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882200.19305: stdout chunk (state=3): >>><<< 11579 1726882200.19316: stderr chunk (state=3): >>><<< 11579 1726882200.19338: _low_level_execute_command() done: rc=0, stdout= {"name": "NetworkManager", "changed": false, "status": {"Type": "dbus", "ExitType": "main", "Restart": "on-failure", "RestartMode": "normal", "NotifyAccess": "none", "RestartUSec": "100ms", "RestartSteps": "0", "RestartMaxDelayUSec": "infinity", "RestartUSecNext": "100ms", "TimeoutStartUSec": "10min", "TimeoutStopUSec": "1min 30s", "TimeoutAbortUSec": "1min 30s", "TimeoutStartFailureMode": "terminate", "TimeoutStopFailureMode": "terminate", "RuntimeMaxUSec": "infinity", "RuntimeRandomizedExtraUSec": "0", "WatchdogUSec": "0", "WatchdogTimestampMonotonic": "0", "RootDirectoryStartOnly": "no", "RemainAfterExit": "no", "GuessMainPID": "yes", "MainPID": "701", "ControlPID": "0", "BusName": "org.freedesktop.NetworkManager", "FileDescriptorStoreMax": "0", "NFileDescriptorStore": "0", "FileDescriptorStorePreserve": "restart", "StatusErrno": "0", "Result": "success", "ReloadResult": "success", "CleanResult": "success", "UID": "[not set]", "GID": "[not set]", "NRestarts": "0", "OOMPolicy": "stop", "ReloadSignal": "1", "ExecMainStartTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainStartTimestampMonotonic": "18353430", "ExecMainExitTimestampMonotonic": "0", "ExecMainHandoffTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ExecMainHandoffTimestampMonotonic": "18368765", "ExecMainPID": "701", "ExecMainCode": "0", "ExecMainStatus": "0", "ExecStart": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecStartEx": "{ path=/usr/sbin/NetworkManager ; argv[]=/usr/sbin/NetworkManager --no-daemon ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReload": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager 
/org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; ignore_errors=no ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "ExecReloadEx": "{ path=/usr/bin/busctl ; argv[]=/usr/bin/busctl call org.freedesktop.NetworkManager /org/freedesktop/NetworkManager org.freedesktop.NetworkManager Reload u 0 ; flags= ; start_time=[n/a] ; stop_time=[n/a] ; pid=0 ; code=(null) ; status=0/0 }", "Slice": "system.slice", "ControlGroup": "/system.slice/NetworkManager.service", "ControlGroupId": "2938", "MemoryCurrent": "10571776", "MemoryPeak": "14331904", "MemorySwapCurrent": "0", "MemorySwapPeak": "0", "MemoryZSwapCurrent": "0", "MemoryAvailable": "3300200448", "EffectiveMemoryMax": "3702886400", "EffectiveMemoryHigh": "3702886400", "CPUUsageNSec": "499621000", "TasksCurrent": "4", "EffectiveTasksMax": "22362", "IPIngressBytes": "[no data]", "IPIngressPackets": "[no data]", "IPEgressBytes": "[no data]", "IPEgressPackets": "[no data]", "IOReadBytes": "[not set]", "IOReadOperations": "[not set]", "IOWriteBytes": "[not set]", "IOWriteOperations": "[not set]", "Delegate": "no", "CPUAccounting": "yes", "CPUWeight": "[not set]", "StartupCPUWeight": "[not set]", "CPUShares": "[not set]", "StartupCPUShares": "[not set]", "CPUQuotaPerSecUSec": "infinity", "CPUQuotaPeriodUSec": "infinity", "IOAccounting": "no", "IOWeight": "[not set]", "StartupIOWeight": "[not set]", "BlockIOAccounting": "no", "BlockIOWeight": "[not set]", "StartupBlockIOWeight": "[not set]", "MemoryAccounting": "yes", "DefaultMemoryLow": "0", "DefaultStartupMemoryLow": "0", "DefaultMemoryMin": "0", "MemoryMin": "0", "MemoryLow": "0", "StartupMemoryLow": "0", "MemoryHigh": "infinity", "StartupMemoryHigh": "infinity", "MemoryMax": "infinity", "StartupMemoryMax": "infinity", "MemorySwapMax": "infinity", "StartupMemorySwapMax": "infinity", "MemoryZSwapMax": "infinity", "StartupMemoryZSwapMax": "infinity", "MemoryZSwapWriteback": "yes", "MemoryLimit": "infinity", "DevicePolicy": 
"auto", "TasksAccounting": "yes", "TasksMax": "22362", "IPAccounting": "no", "ManagedOOMSwap": "auto", "ManagedOOMMemoryPressure": "auto", "ManagedOOMMemoryPressureLimit": "0", "ManagedOOMPreference": "none", "MemoryPressureWatch": "auto", "MemoryPressureThresholdUSec": "200ms", "CoredumpReceive": "no", "UMask": "0022", "LimitCPU": "infinity", "LimitCPUSoft": "infinity", "LimitFSIZE": "infinity", "LimitFSIZESoft": "infinity", "LimitDATA": "infinity", "LimitDATASoft": "infinity", "LimitSTACK": "infinity", "LimitSTACKSoft": "8388608", "LimitCORE": "infinity", "LimitCORESoft": "infinity", "LimitRSS": "infinity", "LimitRSSSoft": "infinity", "LimitNOFILE": "65536", "LimitNOFILESoft": "65536", "LimitAS": "infinity", "LimitASSoft": "infinity", "LimitNPROC": "13976", "LimitNPROCSoft": "13976", "LimitMEMLOCK": "8388608", "LimitMEMLOCKSoft": "8388608", "LimitLOCKS": "infinity", "LimitLOCKSSoft": "infinity", "LimitSIGPENDING": "13976", "LimitSIGPENDINGSoft": "13976", "LimitMSGQUEUE": "819200", "LimitMSGQUEUESoft": "819200", "LimitNICE": "0", "LimitNICESoft": "0", "LimitRTPRIO": "0", "LimitRTPRIOSoft": "0", "LimitRTTIME": "infinity", "LimitRTTIMESoft": "infinity", "RootEphemeral": "no", "OOMScoreAdjust": "0", "CoredumpFilter": "0x33", "Nice": "0", "IOSchedulingClass": "2", "IOSchedulingPriority": "4", "CPUSchedulingPolicy": "0", "CPUSchedulingPriority": "0", "CPUAffinityFromNUMA": "no", "NUMAPolicy": "n/a", "TimerSlackNSec": "50000", "CPUSchedulingResetOnFork": "no", "NonBlocking": "no", "StandardInput": "null", "StandardOutput": "journal", "StandardError": "inherit", "TTYReset": "no", "TTYVHangup": "no", "TTYVTDisallocate": "no", "SyslogPriority": "30", "SyslogLevelPrefix": "yes", "SyslogLevel": "6", "SyslogFacility": "3", "LogLevelMax": "-1", "LogRateLimitIntervalUSec": "0", "LogRateLimitBurst": "0", "SecureBits": "0", "CapabilityBoundingSet": "cap_dac_override cap_kill cap_setgid cap_setuid cap_net_bind_service cap_net_admin cap_net_raw cap_sys_module cap_sys_chroot 
cap_audit_write", "DynamicUser": "no", "SetLoginEnvironment": "no", "RemoveIPC": "no", "PrivateTmp": "no", "PrivateDevices": "no", "ProtectClock": "no", "ProtectKernelTunables": "no", "ProtectKernelModules": "no", "ProtectKernelLogs": "no", "ProtectControlGroups": "no", "PrivateNetwork": "no", "PrivateUsers": "no", "PrivateMounts": "no", "PrivateIPC": "no", "ProtectHome": "read-only", "ProtectSystem": "yes", "SameProcessGroup": "no", "UtmpMode": "init", "IgnoreSIGPIPE": "yes", "NoNewPrivileges": "no", "SystemCallErrorNumber": "2147483646", "LockPersonality": "no", "RuntimeDirectoryPreserve": "no", "RuntimeDirectoryMode": "0755", "StateDirectoryMode": "0755", "CacheDirectoryMode": "0755", "LogsDirectoryMode": "0755", "ConfigurationDirectoryMode": "0755", "TimeoutCleanUSec": "infinity", "MemoryDenyWriteExecute": "no", "RestrictRealtime": "no", "RestrictSUIDSGID": "no", "RestrictNamespaces": "no", "MountAPIVFS": "no", "KeyringMode": "private", "ProtectProc": "default", "ProcSubset": "all", "ProtectHostname": "no", "MemoryKSM": "no", "RootImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "MountImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "ExtensionImagePolicy": "root=verity+signed+encrypted+unprotected+absent:usr=verity+signed+encrypted+unprotected+absent:home=encrypted+unprotected+absent:srv=encrypted+unprotected+absent:tmp=encrypted+unprotected+absent:var=encrypted+unprotected+absent", "KillMode": "process", "KillSignal": "15", "RestartKillSignal": "15", "FinalKillSignal": "9", "SendSIGKILL": "yes", "SendSIGHUP": "no", "WatchdogSignal": "6", "Id": "NetworkManager.service", 
"Names": "NetworkManager.service", "Requires": "dbus.socket system.slice sysinit.target", "Wants": "network.target", "BindsTo": "dbus-broker.service", "RequiredBy": "NetworkManager-wait-online.service", "WantedBy": "multi-user.target", "Conflicts": "shutdown.target", "Before": "multi-user.target NetworkManager-wait-online.service network.target cloud-init.service shutdown.target", "After": "basic.target system.slice sysinit.target systemd-journald.socket network-pre.target dbus-broker.service dbus.socket cloud-init-local.service", "Documentation": "\"man:NetworkManager(8)\"", "Description": "Network Manager", "AccessSELinuxContext": "system_u:object_r:NetworkManager_unit_file_t:s0", "LoadState": "loaded", "ActiveState": "active", "FreezerState": "running", "SubState": "running", "FragmentPath": "/usr/lib/systemd/system/NetworkManager.service", "UnitFileState": "enabled", "UnitFilePreset": "enabled", "StateChangeTimestamp": "Fri 2024-09-20 21:29:37 EDT", "StateChangeTimestampMonotonic": "610814281", "InactiveExitTimestamp": "Fri 2024-09-20 21:19:45 EDT", "InactiveExitTimestampMonotonic": "18353817", "ActiveEnterTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ActiveEnterTimestampMonotonic": "18664782", "ActiveExitTimestampMonotonic": "0", "InactiveEnterTimestampMonotonic": "0", "CanStart": "yes", "CanStop": "yes", "CanReload": "yes", "CanIsolate": "no", "CanFreeze": "yes", "StopWhenUnneeded": "no", "RefuseManualStart": "no", "RefuseManualStop": "no", "AllowIsolate": "no", "DefaultDependencies": "yes", "SurviveFinalKillSignal": "no", "OnSuccessJobMode": "fail", "OnFailureJobMode": "replace", "IgnoreOnIsolate": "no", "NeedDaemonReload": "no", "JobTimeoutUSec": "infinity", "JobRunningTimeoutUSec": "infinity", "JobTimeoutAction": "none", "ConditionResult": "yes", "AssertResult": "yes", "ConditionTimestamp": "Fri 2024-09-20 21:19:45 EDT", "ConditionTimestampMonotonic": "18352589", "AssertTimestamp": "Fri 2024-09-20 21:19:45 EDT", "AssertTimestampMonotonic": "18352592", 
"Transient": "no", "Perpetual": "no", "StartLimitIntervalUSec": "10s", "StartLimitBurst": "5", "StartLimitAction": "none", "FailureAction": "none", "SuccessAction": "none", "InvocationID": "ccc4619c603e4305b3d5044f460b1d5b", "CollectMode": "inactive"}, "enabled": true, "state": "started", "invocation": {"module_args": {"name": "NetworkManager", "state": "started", "enabled": true, "daemon_reload": false, "daemon_reexec": false, "scope": "system", "no_block": false, "force": null, "masked": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882200.19686: done with _execute_module (ansible.legacy.systemd, {'name': 'NetworkManager', 'state': 'started', 'enabled': True, '_ansible_check_mode': False, '_ansible_no_log': True, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.systemd', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882200.19759: _low_level_execute_command(): starting 11579 1726882200.19770: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882199.769842-12944-161261447700800/ > /dev/null 2>&1 && sleep 0' 11579 1726882200.20374: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882200.20390: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882200.20413: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882200.20513: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.20529: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882200.20553: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882200.20571: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.20669: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.22511: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882200.22529: stderr chunk (state=3): >>><<< 11579 1726882200.22539: stdout chunk (state=3): >>><<< 11579 1726882200.22555: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: 
master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882200.22562: handler run complete 11579 1726882200.22635: attempt loop complete, returning result 11579 1726882200.22639: _execute() done 11579 1726882200.22641: dumping result to json 11579 1726882200.22661: done dumping result, returning 11579 1726882200.22671: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start NetworkManager [12673a56-9f93-f197-7423-000000000088] 11579 1726882200.22675: sending task result for task 12673a56-9f93-f197-7423-000000000088 ok: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882200.23042: no more pending results, returning what we have 11579 1726882200.23046: results queue empty 11579 1726882200.23046: checking for any_errors_fatal 11579 1726882200.23054: done checking for any_errors_fatal 11579 1726882200.23055: checking for max_fail_percentage 11579 1726882200.23057: done checking for max_fail_percentage 11579 1726882200.23058: checking to see if all hosts have failed and the running result is not ok 11579 1726882200.23059: done checking to see if all hosts have failed 11579 1726882200.23059: getting the remaining hosts for this loop 11579 1726882200.23061: done getting the remaining hosts for this loop 11579 1726882200.23063: getting the next task for host managed_node1 11579 1726882200.23071: done getting next task for host managed_node1 11579 1726882200.23075: ^ task is: TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11579 1726882200.23080: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=17, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882200.23096: getting variables 11579 1726882200.23099: in VariableManager get_vars() 11579 1726882200.23139: Calling all_inventory to load vars for managed_node1 11579 1726882200.23142: Calling groups_inventory to load vars for managed_node1 11579 1726882200.23144: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882200.23155: Calling all_plugins_play to load vars for managed_node1 11579 1726882200.23159: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882200.23162: Calling groups_plugins_play to load vars for managed_node1 11579 1726882200.23705: done sending task result for task 12673a56-9f93-f197-7423-000000000088 11579 1726882200.23709: WORKER PROCESS EXITING 11579 1726882200.24679: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882200.26157: done with get_vars() 11579 1726882200.26183: done getting variables 11579 1726882200.26245: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable and start wpa_supplicant] ***** task path: 
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:133 Friday 20 September 2024 21:30:00 -0400 (0:00:00.647) 0:00:28.971 ****** 11579 1726882200.26282: entering _queue_task() for managed_node1/service 11579 1726882200.26832: worker is 1 (out of 1 available) 11579 1726882200.26843: exiting _queue_task() for managed_node1/service 11579 1726882200.26854: done queuing things up, now waiting for results queue to drain 11579 1726882200.26855: waiting for pending results... 11579 1726882200.26985: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant 11579 1726882200.27191: in run() - task 12673a56-9f93-f197-7423-000000000089 11579 1726882200.27196: variable 'ansible_search_path' from source: unknown 11579 1726882200.27199: variable 'ansible_search_path' from source: unknown 11579 1726882200.27201: calling self._execute() 11579 1726882200.27311: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.27323: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882200.27338: variable 'omit' from source: magic vars 11579 1726882200.27720: variable 'ansible_distribution_major_version' from source: facts 11579 1726882200.27745: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882200.27866: variable 'network_provider' from source: set_fact 11579 1726882200.27878: Evaluated conditional (network_provider == "nm"): True 11579 1726882200.27977: variable '__network_wpa_supplicant_required' from source: role '' defaults 11579 1726882200.28171: variable '__network_ieee802_1x_connections_defined' from source: role '' defaults 11579 1726882200.28245: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882200.30630: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 
1726882200.30695: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882200.30732: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882200.30768: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882200.30804: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882200.30891: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882200.30928: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882200.30958: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882200.31010: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882200.31029: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882200.31071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882200.31103: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882200.31130: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882200.31173: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882200.31191: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882200.31241: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882200.31316: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882200.31318: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882200.31336: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882200.31351: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882200.31490: variable 'network_connections' from source: task vars 
11579 1726882200.31509: variable 'port2_profile' from source: play vars 11579 1726882200.31580: variable 'port2_profile' from source: play vars 11579 1726882200.31597: variable 'port1_profile' from source: play vars 11579 1726882200.31665: variable 'port1_profile' from source: play vars 11579 1726882200.31898: variable 'controller_profile' from source: play vars 11579 1726882200.31901: variable 'controller_profile' from source: play vars 11579 1726882200.31903: '/usr/local/lib/python3.12/site-packages/ansible/plugins/test/__init__' skipped due to reserved name 11579 1726882200.31974: Loading TestModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py 11579 1726882200.32024: Loading TestModule 'files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py 11579 1726882200.32060: Loading TestModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py 11579 1726882200.32096: Loading TestModule 'uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py 11579 1726882200.32148: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/core.py (found_in_cache=True, class_only=False) 11579 1726882200.32176: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.files' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/files.py (found_in_cache=True, class_only=False) 11579 1726882200.32211: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882200.32247: Loading TestModule 'ansible_collections.ansible.builtin.plugins.test.uri' from /usr/local/lib/python3.12/site-packages/ansible/plugins/test/uri.py (found_in_cache=True, class_only=False) 11579 1726882200.32306: variable 
'__network_wireless_connections_defined' from source: role '' defaults 11579 1726882200.32558: variable 'network_connections' from source: task vars 11579 1726882200.32573: variable 'port2_profile' from source: play vars 11579 1726882200.32642: variable 'port2_profile' from source: play vars 11579 1726882200.32655: variable 'port1_profile' from source: play vars 11579 1726882200.32723: variable 'port1_profile' from source: play vars 11579 1726882200.32743: variable 'controller_profile' from source: play vars 11579 1726882200.32812: variable 'controller_profile' from source: play vars 11579 1726882200.32867: Evaluated conditional (__network_wpa_supplicant_required): False 11579 1726882200.32876: when evaluation is False, skipping this task 11579 1726882200.32884: _execute() done 11579 1726882200.32898: dumping result to json 11579 1726882200.32907: done dumping result, returning 11579 1726882200.32919: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable and start wpa_supplicant [12673a56-9f93-f197-7423-000000000089] 11579 1726882200.32928: sending task result for task 12673a56-9f93-f197-7423-000000000089 11579 1726882200.33216: done sending task result for task 12673a56-9f93-f197-7423-000000000089 11579 1726882200.33219: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "__network_wpa_supplicant_required", "skip_reason": "Conditional result was False" } 11579 1726882200.33267: no more pending results, returning what we have 11579 1726882200.33271: results queue empty 11579 1726882200.33272: checking for any_errors_fatal 11579 1726882200.33296: done checking for any_errors_fatal 11579 1726882200.33297: checking for max_fail_percentage 11579 1726882200.33299: done checking for max_fail_percentage 11579 1726882200.33300: checking to see if all hosts have failed and the running result is not ok 11579 1726882200.33301: done checking to see if all hosts have failed 11579 1726882200.33302: 
getting the remaining hosts for this loop 11579 1726882200.33303: done getting the remaining hosts for this loop 11579 1726882200.33307: getting the next task for host managed_node1 11579 1726882200.33314: done getting next task for host managed_node1 11579 1726882200.33318: ^ task is: TASK: fedora.linux_system_roles.network : Enable network service 11579 1726882200.33321: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=18, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882200.33338: getting variables 11579 1726882200.33340: in VariableManager get_vars() 11579 1726882200.33383: Calling all_inventory to load vars for managed_node1 11579 1726882200.33386: Calling groups_inventory to load vars for managed_node1 11579 1726882200.33388: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882200.33412: Calling all_plugins_play to load vars for managed_node1 11579 1726882200.33416: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882200.33420: Calling groups_plugins_play to load vars for managed_node1 11579 1726882200.35044: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882200.37100: done with get_vars() 11579 1726882200.37124: done getting variables 11579 1726882200.37184: Loading ActionModule 'service' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/service.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Enable network service] ************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:142 Friday 20 September 2024 21:30:00 -0400 (0:00:00.110) 0:00:29.081 ****** 11579 1726882200.37291: entering _queue_task() for managed_node1/service 11579 1726882200.37726: worker is 1 (out of 1 available) 11579 1726882200.37739: exiting _queue_task() for managed_node1/service 11579 1726882200.37750: done queuing things up, now waiting for results queue to drain 11579 1726882200.37751: waiting for pending results... 
11579 1726882200.38212: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service 11579 1726882200.38217: in run() - task 12673a56-9f93-f197-7423-00000000008a 11579 1726882200.38220: variable 'ansible_search_path' from source: unknown 11579 1726882200.38223: variable 'ansible_search_path' from source: unknown 11579 1726882200.38251: calling self._execute() 11579 1726882200.38360: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.38372: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882200.38388: variable 'omit' from source: magic vars 11579 1726882200.38775: variable 'ansible_distribution_major_version' from source: facts 11579 1726882200.38796: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882200.38918: variable 'network_provider' from source: set_fact 11579 1726882200.38930: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882200.38937: when evaluation is False, skipping this task 11579 1726882200.38945: _execute() done 11579 1726882200.38952: dumping result to json 11579 1726882200.38960: done dumping result, returning 11579 1726882200.38971: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Enable network service [12673a56-9f93-f197-7423-00000000008a] 11579 1726882200.38983: sending task result for task 12673a56-9f93-f197-7423-00000000008a skipping: [managed_node1] => { "censored": "the output has been hidden due to the fact that 'no_log: true' was specified for this result", "changed": false } 11579 1726882200.39138: no more pending results, returning what we have 11579 1726882200.39143: results queue empty 11579 1726882200.39144: checking for any_errors_fatal 11579 1726882200.39151: done checking for any_errors_fatal 11579 1726882200.39152: checking for max_fail_percentage 11579 1726882200.39154: done checking for max_fail_percentage 11579 
1726882200.39155: checking to see if all hosts have failed and the running result is not ok 11579 1726882200.39156: done checking to see if all hosts have failed 11579 1726882200.39157: getting the remaining hosts for this loop 11579 1726882200.39158: done getting the remaining hosts for this loop 11579 1726882200.39161: getting the next task for host managed_node1 11579 1726882200.39170: done getting next task for host managed_node1 11579 1726882200.39174: ^ task is: TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11579 1726882200.39178: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=19, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882200.39202: getting variables 11579 1726882200.39204: in VariableManager get_vars() 11579 1726882200.39249: Calling all_inventory to load vars for managed_node1 11579 1726882200.39252: Calling groups_inventory to load vars for managed_node1 11579 1726882200.39255: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882200.39269: Calling all_plugins_play to load vars for managed_node1 11579 1726882200.39272: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882200.39275: Calling groups_plugins_play to load vars for managed_node1 11579 1726882200.39907: done sending task result for task 12673a56-9f93-f197-7423-00000000008a 11579 1726882200.39910: WORKER PROCESS EXITING 11579 1726882200.40837: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882200.42352: done with get_vars() 11579 1726882200.42386: done getting variables 11579 1726882200.42453: Loading ActionModule 'copy' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/copy.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Ensure initscripts network file dependency is present] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:150 Friday 20 September 2024 21:30:00 -0400 (0:00:00.051) 0:00:29.133 ****** 11579 1726882200.42490: entering _queue_task() for managed_node1/copy 11579 1726882200.42925: worker is 1 (out of 1 available) 11579 1726882200.42937: exiting _queue_task() for managed_node1/copy 11579 1726882200.42947: done queuing things up, now waiting for results queue to drain 11579 1726882200.42949: waiting for pending results... 
11579 1726882200.43141: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present 11579 1726882200.43306: in run() - task 12673a56-9f93-f197-7423-00000000008b 11579 1726882200.43327: variable 'ansible_search_path' from source: unknown 11579 1726882200.43337: variable 'ansible_search_path' from source: unknown 11579 1726882200.43380: calling self._execute() 11579 1726882200.43478: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.43489: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882200.43509: variable 'omit' from source: magic vars 11579 1726882200.44099: variable 'ansible_distribution_major_version' from source: facts 11579 1726882200.44102: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882200.44106: variable 'network_provider' from source: set_fact 11579 1726882200.44109: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882200.44111: when evaluation is False, skipping this task 11579 1726882200.44114: _execute() done 11579 1726882200.44116: dumping result to json 11579 1726882200.44118: done dumping result, returning 11579 1726882200.44121: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Ensure initscripts network file dependency is present [12673a56-9f93-f197-7423-00000000008b] 11579 1726882200.44124: sending task result for task 12673a56-9f93-f197-7423-00000000008b 11579 1726882200.44202: done sending task result for task 12673a56-9f93-f197-7423-00000000008b 11579 1726882200.44206: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11579 1726882200.44255: no more pending results, returning what we have 11579 1726882200.44259: results queue empty 11579 1726882200.44260: checking for 
any_errors_fatal 11579 1726882200.44265: done checking for any_errors_fatal 11579 1726882200.44266: checking for max_fail_percentage 11579 1726882200.44267: done checking for max_fail_percentage 11579 1726882200.44268: checking to see if all hosts have failed and the running result is not ok 11579 1726882200.44269: done checking to see if all hosts have failed 11579 1726882200.44270: getting the remaining hosts for this loop 11579 1726882200.44272: done getting the remaining hosts for this loop 11579 1726882200.44275: getting the next task for host managed_node1 11579 1726882200.44284: done getting next task for host managed_node1 11579 1726882200.44288: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11579 1726882200.44295: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=20, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882200.44316: getting variables 11579 1726882200.44318: in VariableManager get_vars() 11579 1726882200.44365: Calling all_inventory to load vars for managed_node1 11579 1726882200.44368: Calling groups_inventory to load vars for managed_node1 11579 1726882200.44371: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882200.44384: Calling all_plugins_play to load vars for managed_node1 11579 1726882200.44387: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882200.44391: Calling groups_plugins_play to load vars for managed_node1 11579 1726882200.46101: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882200.47604: done with get_vars() 11579 1726882200.47630: done getting variables TASK [fedora.linux_system_roles.network : Configure networking connection profiles] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159 Friday 20 September 2024 21:30:00 -0400 (0:00:00.052) 0:00:29.185 ****** 11579 1726882200.47721: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11579 1726882200.48060: worker is 1 (out of 1 available) 11579 1726882200.48074: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_connections 11579 1726882200.48086: done queuing things up, now waiting for results queue to drain 11579 1726882200.48088: waiting for pending results... 
11579 1726882200.48509: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles 11579 1726882200.48514: in run() - task 12673a56-9f93-f197-7423-00000000008c 11579 1726882200.48548: variable 'ansible_search_path' from source: unknown 11579 1726882200.48556: variable 'ansible_search_path' from source: unknown 11579 1726882200.48600: calling self._execute() 11579 1726882200.48705: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.48717: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882200.48737: variable 'omit' from source: magic vars 11579 1726882200.49121: variable 'ansible_distribution_major_version' from source: facts 11579 1726882200.49139: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882200.49149: variable 'omit' from source: magic vars 11579 1726882200.49221: variable 'omit' from source: magic vars 11579 1726882200.49377: '/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/__init__' skipped due to reserved name 11579 1726882200.53086: Loading FilterModule 'core' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py 11579 1726882200.53091: Loading FilterModule 'encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py 11579 1726882200.53129: Loading FilterModule 'mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py 11579 1726882200.53172: Loading FilterModule 'urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py 11579 1726882200.53410: Loading FilterModule 'urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py 11579 1726882200.53517: variable 'network_provider' from source: set_fact 11579 1726882200.53850: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.core' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/filter/core.py (found_in_cache=True, class_only=False) 11579 1726882200.53868: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.encryption' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/encryption.py (found_in_cache=True, class_only=False) 11579 1726882200.53927: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.mathstuff' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/mathstuff.py (found_in_cache=True, class_only=False) 11579 1726882200.54071: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urls' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urls.py (found_in_cache=True, class_only=False) 11579 1726882200.54098: Loading FilterModule 'ansible_collections.ansible.builtin.plugins.filter.urlsplit' from /usr/local/lib/python3.12/site-packages/ansible/plugins/filter/urlsplit.py (found_in_cache=True, class_only=False) 11579 1726882200.54298: variable 'omit' from source: magic vars 11579 1726882200.54459: variable 'omit' from source: magic vars 11579 1726882200.54567: variable 'network_connections' from source: task vars 11579 1726882200.54579: variable 'port2_profile' from source: play vars 11579 1726882200.54651: variable 'port2_profile' from source: play vars 11579 1726882200.54659: variable 'port1_profile' from source: play vars 11579 1726882200.54726: variable 'port1_profile' from source: play vars 11579 1726882200.54734: variable 'controller_profile' from source: play vars 11579 1726882200.54788: variable 'controller_profile' from source: play vars 11579 1726882200.54947: variable 'omit' from source: magic vars 11579 1726882200.54954: variable '__lsr_ansible_managed' from source: task vars 11579 1726882200.55012: variable '__lsr_ansible_managed' from source: task vars 11579 1726882200.55188: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup 11579 
1726882200.55403: Loaded config def from plugin (lookup/template) 11579 1726882200.55406: Loading LookupModule 'template' from /usr/local/lib/python3.12/site-packages/ansible/plugins/lookup/template.py 11579 1726882200.55430: File lookup term: get_ansible_managed.j2 11579 1726882200.55433: variable 'ansible_search_path' from source: unknown 11579 1726882200.55437: evaluation_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks 11579 1726882200.55479: search_path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/templates/get_ansible_managed.j2 /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/get_ansible_managed.j2 11579 1726882200.55483: variable 'ansible_search_path' from source: unknown 11579 1726882200.62204: variable 'ansible_managed' from source: unknown 11579 1726882200.62244: variable 'omit' from source: magic vars 11579 1726882200.62269: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882200.62292: Loading Connection 'ssh' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882200.62314: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882200.62331: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882200.62341: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882200.62527: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882200.62531: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.62533: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882200.62567: Set connection var ansible_timeout to 10 11579 1726882200.62573: Set connection var ansible_shell_type to sh 11579 1726882200.62581: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882200.62586: Set connection var ansible_shell_executable to /bin/sh 11579 1726882200.62595: Set connection var ansible_pipelining to False 11579 1726882200.62606: Set connection var ansible_connection to ssh 11579 1726882200.62632: variable 'ansible_shell_executable' from source: unknown 11579 1726882200.62636: variable 'ansible_connection' from source: unknown 11579 1726882200.62638: variable 'ansible_module_compression' from source: unknown 11579 1726882200.62641: variable 'ansible_shell_type' from source: unknown 11579 1726882200.62643: variable 'ansible_shell_executable' from source: unknown 11579 1726882200.62645: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882200.62647: variable 'ansible_pipelining' from source: unknown 11579 1726882200.62649: variable 'ansible_timeout' from source: unknown 11579 1726882200.62658: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 
1726882200.62882: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882200.62886: variable 'omit' from source: magic vars 11579 1726882200.62888: starting attempt loop 11579 1726882200.62891: running the handler 11579 1726882200.62894: _low_level_execute_command(): starting 11579 1726882200.62897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882200.63683: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882200.63802: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.63819: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.65504: stdout chunk (state=3): >>>/root <<< 11579 1726882200.65719: stderr chunk (state=3): 
>>>debug2: Received exit status from master 0 <<< 11579 1726882200.65722: stdout chunk (state=3): >>><<< 11579 1726882200.65725: stderr chunk (state=3): >>><<< 11579 1726882200.65728: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882200.65730: _low_level_execute_command(): starting 11579 1726882200.65733: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683 `" && echo ansible-tmp-1726882200.6567075-12984-50670380119683="` echo /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683 `" ) && sleep 0' 11579 1726882200.66273: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882200.66301: stderr chunk (state=3): 
>>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882200.66304: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882200.66394: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882200.66398: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882200.66401: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882200.66403: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.66406: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882200.66408: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882200.66410: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882200.66412: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882200.66414: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882200.66416: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882200.66418: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882200.66421: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882200.66423: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.66506: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882200.66510: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882200.66532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.66610: stderr 
chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.68470: stdout chunk (state=3): >>>ansible-tmp-1726882200.6567075-12984-50670380119683=/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683 <<< 11579 1726882200.68645: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882200.68648: stdout chunk (state=3): >>><<< 11579 1726882200.68650: stderr chunk (state=3): >>><<< 11579 1726882200.68674: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882200.6567075-12984-50670380119683=/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882200.68732: variable 'ansible_module_compression' from source: unknown 11579 1726882200.68788: ANSIBALLZ: using cached module: 
/root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible_collections.fedora.linux_system_roles.plugins.modules.network_connections-ZIP_DEFLATED 11579 1726882200.68837: variable 'ansible_facts' from source: unknown 11579 1726882200.68966: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py 11579 1726882200.69132: Sending initial data 11579 1726882200.69135: Sent initial data (167 bytes) 11579 1726882200.70014: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.70384: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882200.70409: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882200.70448: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.70638: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.72065: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension 
"posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882200.72129: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882200.72205: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpoxj7zy48 /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py <<< 11579 1726882200.72230: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py" <<< 11579 1726882200.72278: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpoxj7zy48" to remote "/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py" <<< 11579 1726882200.73858: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882200.73862: stderr chunk (state=3): >>><<< 11579 1726882200.73865: stdout chunk (state=3): >>><<< 11579 1726882200.73867: done transferring module to remote 11579 1726882200.73869: _low_level_execute_command(): starting 
11579 1726882200.73872: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/ /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py && sleep 0' 11579 1726882200.74655: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882200.74669: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882200.74682: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882200.74702: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882200.74766: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.74810: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882200.74843: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.75042: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882200.76753: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882200.76811: stderr chunk (state=3): >>><<< 11579 
1726882200.76815: stdout chunk (state=3): >>><<< 11579 1726882200.76835: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882200.76922: _low_level_execute_command(): starting 11579 1726882200.76926: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/AnsiballZ_network_connections.py && sleep 0' 11579 1726882200.77491: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882200.77510: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882200.77526: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882200.77554: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882200.77572: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882200.77610: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882200.77695: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882200.77758: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882200.77779: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882200.77975: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.28870: stdout chunk (state=3): >>>Traceback (most recent call last): <<< 11579 1726882201.28898: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail 
ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/29d61a64-4b27-4cf7-b22f-65c039402cbb: error=unknown <<< 11579 1726882201.30643: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back <<< 11579 1726882201.30648: stdout chunk (state=3): >>> File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/f55bc258-3187-4e12-b27d-0cad9097ebf7: error=unknown <<< 11579 1726882201.32390: stdout chunk (state=3): >>>Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/93b53e62-64f5-4c39-966c-031f30f8befe: error=unknown <<< 11579 1726882201.32522: stdout chunk 
(state=3): >>> {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} <<< 11579 1726882201.34808: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882201.34812: stdout chunk (state=3): >>><<< 11579 1726882201.34814: stderr chunk (state=3): >>><<< 11579 1726882201.34876: _low_level_execute_command() done: rc=0, stdout=Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.1/29d61a64-4b27-4cf7-b22f-65c039402cbb: error=unknown Traceback (most recent call last): File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0.0/f55bc258-3187-4e12-b27d-0cad9097ebf7: error=unknown Traceback (most recent call last): File 
"/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/connection.py", line 113, in _nm_profile_volatile_update2_call_back File "/tmp/ansible_fedora.linux_system_roles.network_connections_payload_jb3gn860/ansible_fedora.linux_system_roles.network_connections_payload.zip/ansible_collections/fedora/linux_system_roles/plugins/module_utils/network_lsr/nm/client.py", line 102, in fail ansible_collections.fedora.linux_system_roles.plugins.module_utils.network_lsr.nm.error.LsrNetworkNmError: Connection volatilize aborted on bond0/93b53e62-64f5-4c39-966c-031f30f8befe: error=unknown {"changed": true, "warnings": [], "stderr": "\n", "_invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}, "invocation": {"module_args": {"provider": "nm", "connections": [{"name": "bond0.1", "persistent_state": "absent", "state": "down"}, {"name": "bond0.0", "persistent_state": "absent", "state": "down"}, {"name": "bond0", "persistent_state": "absent", "state": "down"}], "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "ignore_errors": false, "force_state_change": false, "__debug_flags": ""}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882201.34925: done with _execute_module (fedora.linux_system_roles.network_connections, {'provider': 'nm', 'connections': [{'name': 'bond0.1', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0.0', 'persistent_state': 'absent', 'state': 'down'}, {'name': 'bond0', 'persistent_state': 'absent', 'state': 'down'}], '__header': '#\n# Ansible managed\n#\n# system_role:network\n', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'fedora.linux_system_roles.network_connections', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882201.34944: 
_low_level_execute_command(): starting 11579 1726882201.34947: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882200.6567075-12984-50670380119683/ > /dev/null 2>&1 && sleep 0' 11579 1726882201.36486: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882201.36491: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882201.36498: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882201.36596: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882201.36624: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882201.36701: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.38606: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882201.38611: stderr chunk (state=3): >>><<< 11579 1726882201.38800: stdout chunk (state=3): >>><<< 11579 1726882201.38805: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882201.38807: handler run complete 11579 1726882201.38810: attempt loop complete, returning result 11579 1726882201.38812: _execute() done 11579 1726882201.38814: dumping result to json 11579 1726882201.38816: done dumping result, returning 11579 1726882201.38818: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking connection profiles [12673a56-9f93-f197-7423-00000000008c] 11579 1726882201.38820: sending task result for task 12673a56-9f93-f197-7423-00000000008c 11579 1726882201.38903: done sending task result for task 12673a56-9f93-f197-7423-00000000008c 11579 1726882201.38907: WORKER PROCESS EXITING changed: [managed_node1] => { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", 
"persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true } STDERR: 11579 1726882201.39205: no more pending results, returning what we have 11579 1726882201.39209: results queue empty 11579 1726882201.39210: checking for any_errors_fatal 11579 1726882201.39217: done checking for any_errors_fatal 11579 1726882201.39217: checking for max_fail_percentage 11579 1726882201.39219: done checking for max_fail_percentage 11579 1726882201.39220: checking to see if all hosts have failed and the running result is not ok 11579 1726882201.39221: done checking to see if all hosts have failed 11579 1726882201.39222: getting the remaining hosts for this loop 11579 1726882201.39223: done getting the remaining hosts for this loop 11579 1726882201.39226: getting the next task for host managed_node1 11579 1726882201.39233: done getting next task for host managed_node1 11579 1726882201.39236: ^ task is: TASK: fedora.linux_system_roles.network : Configure networking state 11579 1726882201.39239: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=21, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? 
False, did start at task? False), did rescue? False, did start at task? False 11579 1726882201.39249: getting variables 11579 1726882201.39251: in VariableManager get_vars() 11579 1726882201.39289: Calling all_inventory to load vars for managed_node1 11579 1726882201.39291: Calling groups_inventory to load vars for managed_node1 11579 1726882201.39441: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882201.39453: Calling all_plugins_play to load vars for managed_node1 11579 1726882201.39456: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882201.39459: Calling groups_plugins_play to load vars for managed_node1 11579 1726882201.41315: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882201.43350: done with get_vars() 11579 1726882201.43382: done getting variables TASK [fedora.linux_system_roles.network : Configure networking state] ********** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:171 Friday 20 September 2024 21:30:01 -0400 (0:00:00.959) 0:00:30.144 ****** 11579 1726882201.43654: entering _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11579 1726882201.44426: worker is 1 (out of 1 available) 11579 1726882201.44436: exiting _queue_task() for managed_node1/fedora.linux_system_roles.network_state 11579 1726882201.44447: done queuing things up, now waiting for results queue to drain 11579 1726882201.44449: waiting for pending results... 
11579 1726882201.44621: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state 11579 1726882201.44810: in run() - task 12673a56-9f93-f197-7423-00000000008d 11579 1726882201.44857: variable 'ansible_search_path' from source: unknown 11579 1726882201.44868: variable 'ansible_search_path' from source: unknown 11579 1726882201.44938: calling self._execute() 11579 1726882201.45048: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.45199: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.45203: variable 'omit' from source: magic vars 11579 1726882201.45479: variable 'ansible_distribution_major_version' from source: facts 11579 1726882201.45501: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882201.45632: variable 'network_state' from source: role '' defaults 11579 1726882201.45650: Evaluated conditional (network_state != {}): False 11579 1726882201.45661: when evaluation is False, skipping this task 11579 1726882201.45668: _execute() done 11579 1726882201.45676: dumping result to json 11579 1726882201.45684: done dumping result, returning 11579 1726882201.45701: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Configure networking state [12673a56-9f93-f197-7423-00000000008d] 11579 1726882201.45712: sending task result for task 12673a56-9f93-f197-7423-00000000008d 11579 1726882201.45837: done sending task result for task 12673a56-9f93-f197-7423-00000000008d 11579 1726882201.45840: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_state != {}", "skip_reason": "Conditional result was False" } 11579 1726882201.45926: no more pending results, returning what we have 11579 1726882201.45930: results queue empty 11579 1726882201.45931: checking for any_errors_fatal 11579 1726882201.45942: done checking for any_errors_fatal 
11579 1726882201.45943: checking for max_fail_percentage 11579 1726882201.45944: done checking for max_fail_percentage 11579 1726882201.45945: checking to see if all hosts have failed and the running result is not ok 11579 1726882201.45946: done checking to see if all hosts have failed 11579 1726882201.45947: getting the remaining hosts for this loop 11579 1726882201.45948: done getting the remaining hosts for this loop 11579 1726882201.45951: getting the next task for host managed_node1 11579 1726882201.45960: done getting next task for host managed_node1 11579 1726882201.45964: ^ task is: TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11579 1726882201.45969: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=22, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882201.45991: getting variables 11579 1726882201.45997: in VariableManager get_vars() 11579 1726882201.46047: Calling all_inventory to load vars for managed_node1 11579 1726882201.46050: Calling groups_inventory to load vars for managed_node1 11579 1726882201.46053: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882201.46064: Calling all_plugins_play to load vars for managed_node1 11579 1726882201.46067: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882201.46070: Calling groups_plugins_play to load vars for managed_node1 11579 1726882201.47779: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882201.50452: done with get_vars() 11579 1726882201.50488: done getting variables 11579 1726882201.50549: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show stderr messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:177 Friday 20 September 2024 21:30:01 -0400 (0:00:00.069) 0:00:30.214 ****** 11579 1726882201.50583: entering _queue_task() for managed_node1/debug 11579 1726882201.50922: worker is 1 (out of 1 available) 11579 1726882201.50935: exiting _queue_task() for managed_node1/debug 11579 1726882201.50945: done queuing things up, now waiting for results queue to drain 11579 1726882201.50946: waiting for pending results... 
11579 1726882201.51322: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections 11579 1726882201.51361: in run() - task 12673a56-9f93-f197-7423-00000000008e 11579 1726882201.51384: variable 'ansible_search_path' from source: unknown 11579 1726882201.51397: variable 'ansible_search_path' from source: unknown 11579 1726882201.51527: calling self._execute() 11579 1726882201.51550: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.51560: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.51575: variable 'omit' from source: magic vars 11579 1726882201.51963: variable 'ansible_distribution_major_version' from source: facts 11579 1726882201.51982: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882201.51997: variable 'omit' from source: magic vars 11579 1726882201.52074: variable 'omit' from source: magic vars 11579 1726882201.52299: variable 'omit' from source: magic vars 11579 1726882201.52344: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882201.52408: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882201.52439: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882201.52459: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.52475: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.52517: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882201.52525: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.52534: variable 'ansible_ssh_extra_args' from 
source: host vars for 'managed_node1' 11579 1726882201.53010: Set connection var ansible_timeout to 10 11579 1726882201.53013: Set connection var ansible_shell_type to sh 11579 1726882201.53016: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882201.53017: Set connection var ansible_shell_executable to /bin/sh 11579 1726882201.53019: Set connection var ansible_pipelining to False 11579 1726882201.53021: Set connection var ansible_connection to ssh 11579 1726882201.53023: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.53025: variable 'ansible_connection' from source: unknown 11579 1726882201.53027: variable 'ansible_module_compression' from source: unknown 11579 1726882201.53029: variable 'ansible_shell_type' from source: unknown 11579 1726882201.53030: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.53032: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.53034: variable 'ansible_pipelining' from source: unknown 11579 1726882201.53036: variable 'ansible_timeout' from source: unknown 11579 1726882201.53038: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.53323: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882201.53454: variable 'omit' from source: magic vars 11579 1726882201.53457: starting attempt loop 11579 1726882201.53460: running the handler 11579 1726882201.53598: variable '__network_connections_result' from source: set_fact 11579 1726882201.53661: handler run complete 11579 1726882201.53677: attempt loop complete, returning result 11579 1726882201.53680: _execute() done 11579 1726882201.53682: dumping result to json 11579 1726882201.53691: 
done dumping result, returning 11579 1726882201.53770: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show stderr messages for the network_connections [12673a56-9f93-f197-7423-00000000008e] 11579 1726882201.53772: sending task result for task 12673a56-9f93-f197-7423-00000000008e 11579 1726882201.53836: done sending task result for task 12673a56-9f93-f197-7423-00000000008e 11579 1726882201.53839: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result.stderr_lines": [ "" ] } 11579 1726882201.53934: no more pending results, returning what we have 11579 1726882201.53937: results queue empty 11579 1726882201.53938: checking for any_errors_fatal 11579 1726882201.53942: done checking for any_errors_fatal 11579 1726882201.53943: checking for max_fail_percentage 11579 1726882201.53945: done checking for max_fail_percentage 11579 1726882201.53945: checking to see if all hosts have failed and the running result is not ok 11579 1726882201.53946: done checking to see if all hosts have failed 11579 1726882201.53947: getting the remaining hosts for this loop 11579 1726882201.53948: done getting the remaining hosts for this loop 11579 1726882201.53951: getting the next task for host managed_node1 11579 1726882201.53956: done getting next task for host managed_node1 11579 1726882201.53960: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11579 1726882201.53963: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=23, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882201.53972: getting variables 11579 1726882201.53973: in VariableManager get_vars() 11579 1726882201.54069: Calling all_inventory to load vars for managed_node1 11579 1726882201.54072: Calling groups_inventory to load vars for managed_node1 11579 1726882201.54074: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882201.54083: Calling all_plugins_play to load vars for managed_node1 11579 1726882201.54085: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882201.54088: Calling groups_plugins_play to load vars for managed_node1 11579 1726882201.56655: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882201.59836: done with get_vars() 11579 1726882201.59988: done getting variables 11579 1726882201.60053: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_connections] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:181 Friday 20 September 2024 21:30:01 -0400 (0:00:00.095) 0:00:30.309 ****** 11579 1726882201.60128: entering _queue_task() for managed_node1/debug 11579 1726882201.60482: worker is 1 (out of 1 available) 11579 
1726882201.60496: exiting _queue_task() for managed_node1/debug 11579 1726882201.60508: done queuing things up, now waiting for results queue to drain 11579 1726882201.60510: waiting for pending results... 11579 1726882201.60890: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections 11579 1726882201.61204: in run() - task 12673a56-9f93-f197-7423-00000000008f 11579 1726882201.61216: variable 'ansible_search_path' from source: unknown 11579 1726882201.61220: variable 'ansible_search_path' from source: unknown 11579 1726882201.61259: calling self._execute() 11579 1726882201.61570: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.61576: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.61591: variable 'omit' from source: magic vars 11579 1726882201.62413: variable 'ansible_distribution_major_version' from source: facts 11579 1726882201.62428: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882201.62434: variable 'omit' from source: magic vars 11579 1726882201.62511: variable 'omit' from source: magic vars 11579 1726882201.62548: variable 'omit' from source: magic vars 11579 1726882201.62588: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882201.62727: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882201.62731: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882201.62733: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.62736: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.62738: variable 
'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882201.62740: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.62741: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.62844: Set connection var ansible_timeout to 10 11579 1726882201.62850: Set connection var ansible_shell_type to sh 11579 1726882201.62858: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882201.62863: Set connection var ansible_shell_executable to /bin/sh 11579 1726882201.62870: Set connection var ansible_pipelining to False 11579 1726882201.62873: Set connection var ansible_connection to ssh 11579 1726882201.62899: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.62902: variable 'ansible_connection' from source: unknown 11579 1726882201.62905: variable 'ansible_module_compression' from source: unknown 11579 1726882201.62907: variable 'ansible_shell_type' from source: unknown 11579 1726882201.62910: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.62912: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.62914: variable 'ansible_pipelining' from source: unknown 11579 1726882201.62922: variable 'ansible_timeout' from source: unknown 11579 1726882201.62925: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.63081: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882201.63199: variable 'omit' from source: magic vars 11579 1726882201.63203: starting attempt loop 11579 1726882201.63205: running the handler 11579 1726882201.63207: variable '__network_connections_result' from source: set_fact 11579 
1726882201.63235: variable '__network_connections_result' from source: set_fact 11579 1726882201.63368: handler run complete 11579 1726882201.63400: attempt loop complete, returning result 11579 1726882201.63403: _execute() done 11579 1726882201.63406: dumping result to json 11579 1726882201.63411: done dumping result, returning 11579 1726882201.63420: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_connections [12673a56-9f93-f197-7423-00000000008f] 11579 1726882201.63426: sending task result for task 12673a56-9f93-f197-7423-00000000008f 11579 1726882201.63524: done sending task result for task 12673a56-9f93-f197-7423-00000000008f 11579 1726882201.63528: WORKER PROCESS EXITING ok: [managed_node1] => { "__network_connections_result": { "_invocation": { "module_args": { "__debug_flags": "", "__header": "#\n# Ansible managed\n#\n# system_role:network\n", "connections": [ { "name": "bond0.1", "persistent_state": "absent", "state": "down" }, { "name": "bond0.0", "persistent_state": "absent", "state": "down" }, { "name": "bond0", "persistent_state": "absent", "state": "down" } ], "force_state_change": false, "ignore_errors": false, "provider": "nm" } }, "changed": true, "failed": false, "stderr": "\n", "stderr_lines": [ "" ] } } 11579 1726882201.63636: no more pending results, returning what we have 11579 1726882201.63639: results queue empty 11579 1726882201.63640: checking for any_errors_fatal 11579 1726882201.63647: done checking for any_errors_fatal 11579 1726882201.63648: checking for max_fail_percentage 11579 1726882201.63650: done checking for max_fail_percentage 11579 1726882201.63651: checking to see if all hosts have failed and the running result is not ok 11579 1726882201.63652: done checking to see if all hosts have failed 11579 1726882201.63653: getting the remaining hosts for this loop 11579 1726882201.63654: done getting the remaining hosts for this loop 11579 1726882201.63657: 
getting the next task for host managed_node1 11579 1726882201.63666: done getting next task for host managed_node1 11579 1726882201.63669: ^ task is: TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11579 1726882201.63674: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=24, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882201.63686: getting variables 11579 1726882201.63688: in VariableManager get_vars() 11579 1726882201.63842: Calling all_inventory to load vars for managed_node1 11579 1726882201.63845: Calling groups_inventory to load vars for managed_node1 11579 1726882201.63848: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882201.63859: Calling all_plugins_play to load vars for managed_node1 11579 1726882201.63869: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882201.63872: Calling groups_plugins_play to load vars for managed_node1 11579 1726882201.65342: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882201.66858: done with get_vars() 11579 1726882201.66885: done getting variables 11579 1726882201.66950: Loading ActionModule 'debug' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/debug.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [fedora.linux_system_roles.network : Show debug messages for the network_state] *** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:186 Friday 20 September 2024 21:30:01 -0400 (0:00:00.068) 0:00:30.378 ****** 11579 1726882201.66985: entering _queue_task() for managed_node1/debug 11579 1726882201.67341: worker is 1 (out of 1 available) 11579 1726882201.67470: exiting _queue_task() for managed_node1/debug 11579 1726882201.67481: done queuing things up, now waiting for results queue to drain 11579 1726882201.67482: waiting for pending results... 
11579 1726882201.67801: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state 11579 1726882201.67882: in run() - task 12673a56-9f93-f197-7423-000000000090 11579 1726882201.67886: variable 'ansible_search_path' from source: unknown 11579 1726882201.67899: variable 'ansible_search_path' from source: unknown 11579 1726882201.67901: calling self._execute() 11579 1726882201.68003: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.68014: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.68029: variable 'omit' from source: magic vars 11579 1726882201.68435: variable 'ansible_distribution_major_version' from source: facts 11579 1726882201.68447: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882201.68575: variable 'network_state' from source: role '' defaults 11579 1726882201.68585: Evaluated conditional (network_state != {}): False 11579 1726882201.68588: when evaluation is False, skipping this task 11579 1726882201.68591: _execute() done 11579 1726882201.68598: dumping result to json 11579 1726882201.68600: done dumping result, returning 11579 1726882201.68607: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Show debug messages for the network_state [12673a56-9f93-f197-7423-000000000090] 11579 1726882201.68658: sending task result for task 12673a56-9f93-f197-7423-000000000090 11579 1726882201.68724: done sending task result for task 12673a56-9f93-f197-7423-000000000090 11579 1726882201.68727: WORKER PROCESS EXITING skipping: [managed_node1] => { "false_condition": "network_state != {}" } 11579 1726882201.68816: no more pending results, returning what we have 11579 1726882201.68821: results queue empty 11579 1726882201.68822: checking for any_errors_fatal 11579 1726882201.68831: done checking for any_errors_fatal 11579 1726882201.68832: checking for 
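The skip above comes from the task's conditional: network_state resolves to the role default {}, so "network_state != {}" evaluates False and the task never runs, reporting only the false_condition. A minimal sketch of that short-circuit, assuming a simplified task runner (not ansible-core's TaskExecutor):

```python
# Hedged sketch of a task-level "when" guard producing the skip result
# seen in the log: a False conditional short-circuits execution and the
# result carries the failed condition text instead of module output.
def run_task(variables, false_condition="network_state != {}"):
    network_state = variables.get("network_state", {})  # role default is {}
    if not (network_state != {}):
        # condition is False -> skip, as in "skipping: [managed_node1]"
        return {"skipped": True, "false_condition": false_condition}
    return {"changed": False}

skipped = run_task({"network_state": {}})
executed = run_task({"network_state": {"interfaces": []}})
```

With the default empty dict the first call skips; supplying a non-empty network_state lets the task body run.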
max_fail_percentage 11579 1726882201.68834: done checking for max_fail_percentage 11579 1726882201.68835: checking to see if all hosts have failed and the running result is not ok 11579 1726882201.68836: done checking to see if all hosts have failed 11579 1726882201.68837: getting the remaining hosts for this loop 11579 1726882201.68839: done getting the remaining hosts for this loop 11579 1726882201.68842: getting the next task for host managed_node1 11579 1726882201.68852: done getting next task for host managed_node1 11579 1726882201.68856: ^ task is: TASK: fedora.linux_system_roles.network : Re-test connectivity 11579 1726882201.68861: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=25, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882201.69070: getting variables 11579 1726882201.69072: in VariableManager get_vars() 11579 1726882201.69109: Calling all_inventory to load vars for managed_node1 11579 1726882201.69112: Calling groups_inventory to load vars for managed_node1 11579 1726882201.69114: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882201.69123: Calling all_plugins_play to load vars for managed_node1 11579 1726882201.69126: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882201.69129: Calling groups_plugins_play to load vars for managed_node1 11579 1726882201.78163: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882201.80271: done with get_vars() 11579 1726882201.80308: done getting variables TASK [fedora.linux_system_roles.network : Re-test connectivity] **************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192 Friday 20 September 2024 21:30:01 -0400 (0:00:00.134) 0:00:30.512 ****** 11579 1726882201.80401: entering _queue_task() for managed_node1/ping 11579 1726882201.80780: worker is 1 (out of 1 available) 11579 1726882201.80795: exiting _queue_task() for managed_node1/ping 11579 1726882201.80809: done queuing things up, now waiting for results queue to drain 11579 1726882201.80812: waiting for pending results... 
11579 1726882201.81416: running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity 11579 1726882201.81702: in run() - task 12673a56-9f93-f197-7423-000000000091 11579 1726882201.81708: variable 'ansible_search_path' from source: unknown 11579 1726882201.81713: variable 'ansible_search_path' from source: unknown 11579 1726882201.81822: calling self._execute() 11579 1726882201.81966: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.81971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.81984: variable 'omit' from source: magic vars 11579 1726882201.82401: variable 'ansible_distribution_major_version' from source: facts 11579 1726882201.82415: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882201.82421: variable 'omit' from source: magic vars 11579 1726882201.82489: variable 'omit' from source: magic vars 11579 1726882201.82526: variable 'omit' from source: magic vars 11579 1726882201.82567: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882201.82608: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882201.82627: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882201.82645: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.82658: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882201.82691: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882201.82703: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.82707: variable 'ansible_ssh_extra_args' from source: host vars for 
'managed_node1' 11579 1726882201.82811: Set connection var ansible_timeout to 10 11579 1726882201.82818: Set connection var ansible_shell_type to sh 11579 1726882201.82833: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882201.82838: Set connection var ansible_shell_executable to /bin/sh 11579 1726882201.82846: Set connection var ansible_pipelining to False 11579 1726882201.82849: Set connection var ansible_connection to ssh 11579 1726882201.82870: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.82873: variable 'ansible_connection' from source: unknown 11579 1726882201.82876: variable 'ansible_module_compression' from source: unknown 11579 1726882201.82878: variable 'ansible_shell_type' from source: unknown 11579 1726882201.82880: variable 'ansible_shell_executable' from source: unknown 11579 1726882201.82883: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882201.82891: variable 'ansible_pipelining' from source: unknown 11579 1726882201.82898: variable 'ansible_timeout' from source: unknown 11579 1726882201.82901: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882201.83113: Loading ActionModule 'normal' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/normal.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 11579 1726882201.83124: variable 'omit' from source: magic vars 11579 1726882201.83127: starting attempt loop 11579 1726882201.83130: running the handler 11579 1726882201.83144: _low_level_execute_command(): starting 11579 1726882201.83155: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882201.83900: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: 
Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass <<< 11579 1726882201.84002: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882201.84344: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882201.84387: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.86149: stdout chunk (state=3): >>>/root <<< 11579 1726882201.86277: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882201.86280: stdout chunk (state=3): >>><<< 11579 1726882201.86291: stderr chunk (state=3): >>><<< 11579 1726882201.86424: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is 
address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882201.86428: _low_level_execute_command(): starting 11579 1726882201.86431: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624 `" && echo ansible-tmp-1726882201.8631535-13060-44309232555624="` echo /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624 `" ) && sleep 0' 11579 1726882201.86956: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882201.86968: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882201.86981: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882201.86997: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882201.87068: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration 
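The mkdir command above creates the remote working directory under a restrictive umask, with a name of the shape ansible-tmp-&lt;epoch&gt;-&lt;pid&gt;-&lt;random&gt;. A sketch of that naming pattern, assuming it simply concatenates those three values (the exact generation code in ansible-core may differ; this only mirrors the shape observed in the log):

```python
import os
import random
import time

# Hedged sketch of the remote tmpdir naming pattern visible in the log:
# ansible-tmp-<time.time()>-<pid>-<random>. Field choices are an assumption.
def make_tmpdir_name(now=None, pid=None, rand=None):
    now = time.time() if now is None else now
    pid = os.getpid() if pid is None else pid
    rand = random.randint(0, 2**48) if rand is None else rand
    return "ansible-tmp-{0}-{1}-{2}".format(now, pid, rand)

# Reproducing the components seen above yields the same directory name
# as /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624
name = make_tmpdir_name(now=1726882201.8631535, pid=13060, rand=44309232555624)
```

The `umask 77 && mkdir` pairing in the logged command ensures the directory is created mode 0700, readable only by the connecting user.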
data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882201.87121: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882201.87168: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882201.87197: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882201.87292: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.89272: stdout chunk (state=3): >>>ansible-tmp-1726882201.8631535-13060-44309232555624=/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624 <<< 11579 1726882201.89601: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882201.89605: stderr chunk (state=3): >>><<< 11579 1726882201.89607: stdout chunk (state=3): >>><<< 11579 1726882201.89610: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882201.8631535-13060-44309232555624=/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882201.89613: variable 'ansible_module_compression' from source: unknown 11579 1726882201.89614: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.ping-ZIP_DEFLATED 11579 1726882201.89616: variable 'ansible_facts' from source: unknown 11579 1726882201.89652: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py 11579 1726882201.89917: Sending initial data 11579 1726882201.89921: Sent initial data (152 bytes) 11579 1726882201.90549: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882201.90555: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882201.90573: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882201.90578: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882201.90718: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882201.90750: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.92359: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882201.92386: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882201.92424: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882201.92499: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw0sy6yw7 /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py <<< 11579 1726882201.92503: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py" <<< 11579 1726882201.92535: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpw0sy6yw7" to remote "/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py" <<< 11579 1726882201.93241: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882201.93305: stderr chunk (state=3): >>><<< 11579 1726882201.93376: stdout chunk (state=3): >>><<< 11579 1726882201.93387: done transferring module to remote 11579 1726882201.93405: _low_level_execute_command(): starting 11579 1726882201.93416: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/ /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py && sleep 0' 11579 1726882201.94057: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882201.94069: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882201.94105: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests 
final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882201.94119: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882201.94134: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882201.94211: stderr chunk (state=3): >>>debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882201.94233: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882201.94371: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882201.96383: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882201.96387: stdout chunk (state=3): >>><<< 11579 1726882201.96391: stderr chunk (state=3): >>><<< 11579 1726882201.96402: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882201.96406: _low_level_execute_command(): starting 11579 1726882201.96409: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/AnsiballZ_ping.py && sleep 0' 11579 1726882201.96916: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882201.97009: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882201.97030: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882201.97043: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882201.97066: stderr chunk (state=3): >>>debug2: 
mux_client_hello_exchange: master version 4 <<< 11579 1726882201.97138: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.12027: stdout chunk (state=3): >>> {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} <<< 11579 1726882202.13805: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. <<< 11579 1726882202.13809: stdout chunk (state=3): >>><<< 11579 1726882202.13812: stderr chunk (state=3): >>><<< 11579 1726882202.13814: _low_level_execute_command() done: rc=0, stdout= {"ping": "pong", "invocation": {"module_args": {"data": "pong"}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
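The module run above returns {"ping": "pong"}: the ping module echoes its data argument back under the ping key, defaulting to "pong". A minimal sketch of that behavior, assuming the documented interface of ansible.modules.ping (the real module is packaged and executed remotely as the AnsiballZ_ping.py seen in the log; this is a local stand-in, not its actual source):

```python
# Hedged sketch of the ping module's contract: echo the "data" argument
# back as "ping", with "pong" as the default, and report the invocation.
def ping_module(module_args):
    data = module_args.get("data", "pong")
    if data == "crash":
        # the documented module also deliberately fails for data=crash
        raise RuntimeError("boom")
    return {"ping": data, "invocation": {"module_args": {"data": data}}}

result = ping_module({"data": "pong"})
```

The returned dict matches the stdout chunk captured above, which is how the Re-test connectivity task verifies the host is still reachable after the connection changes.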
11579 1726882202.13817: done with _execute_module (ping, {'_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ping', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882202.13819: _low_level_execute_command(): starting 11579 1726882202.13821: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882201.8631535-13060-44309232555624/ > /dev/null 2>&1 && sleep 0' 11579 1726882202.14532: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.14549: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.14562: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.14579: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.14606: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882202.14704: stderr chunk (state=3): >>>debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.14828: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.14992: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.16720: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.16784: stderr chunk (state=3): >>><<< 11579 1726882202.16796: stdout chunk (state=3): >>><<< 11579 1726882202.16933: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 
0 11579 1726882202.16940: handler run complete 11579 1726882202.16942: attempt loop complete, returning result 11579 1726882202.16944: _execute() done 11579 1726882202.16946: dumping result to json 11579 1726882202.16948: done dumping result, returning 11579 1726882202.16949: done running TaskExecutor() for managed_node1/TASK: fedora.linux_system_roles.network : Re-test connectivity [12673a56-9f93-f197-7423-000000000091] 11579 1726882202.16951: sending task result for task 12673a56-9f93-f197-7423-000000000091 ok: [managed_node1] => { "changed": false, "ping": "pong" } 11579 1726882202.17173: no more pending results, returning what we have 11579 1726882202.17176: results queue empty 11579 1726882202.17177: checking for any_errors_fatal 11579 1726882202.17186: done checking for any_errors_fatal 11579 1726882202.17186: checking for max_fail_percentage 11579 1726882202.17188: done checking for max_fail_percentage 11579 1726882202.17189: checking to see if all hosts have failed and the running result is not ok 11579 1726882202.17190: done checking to see if all hosts have failed 11579 1726882202.17190: getting the remaining hosts for this loop 11579 1726882202.17192: done getting the remaining hosts for this loop 11579 1726882202.17199: getting the next task for host managed_node1 11579 1726882202.17209: done getting next task for host managed_node1 11579 1726882202.17211: ^ task is: TASK: meta (role_complete) 11579 1726882202.17215: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? 
(HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882202.17228: getting variables 11579 1726882202.17230: in VariableManager get_vars() 11579 1726882202.17277: Calling all_inventory to load vars for managed_node1 11579 1726882202.17280: Calling groups_inventory to load vars for managed_node1 11579 1726882202.17282: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882202.17408: Calling all_plugins_play to load vars for managed_node1 11579 1726882202.17414: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882202.17418: Calling groups_plugins_play to load vars for managed_node1 11579 1726882202.18575: done sending task result for task 12673a56-9f93-f197-7423-000000000091 11579 1726882202.18579: WORKER PROCESS EXITING 11579 1726882202.19177: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882202.21224: done with get_vars() 11579 1726882202.21252: done getting variables 11579 1726882202.21551: done queuing things up, now waiting for results queue to drain 11579 1726882202.21553: results queue empty 11579 1726882202.21554: checking for any_errors_fatal 11579 1726882202.21556: done checking for any_errors_fatal 11579 1726882202.21557: checking for max_fail_percentage 11579 1726882202.21558: done checking for max_fail_percentage 11579 1726882202.21559: checking to see if all hosts have failed and the running result is not ok 11579 1726882202.21559: done checking to see if all hosts have failed 11579 1726882202.21560: getting the remaining hosts for this loop 11579 
1726882202.21561: done getting the remaining hosts for this loop 11579 1726882202.21564: getting the next task for host managed_node1 11579 1726882202.21568: done getting next task for host managed_node1 11579 1726882202.21570: ^ task is: TASK: Delete the device '{{ controller_device }}' 11579 1726882202.21572: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=3, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882202.21574: getting variables 11579 1726882202.21575: in VariableManager get_vars() 11579 1726882202.21591: Calling all_inventory to load vars for managed_node1 11579 1726882202.21700: Calling groups_inventory to load vars for managed_node1 11579 1726882202.21703: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882202.21709: Calling all_plugins_play to load vars for managed_node1 11579 1726882202.21711: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882202.21719: Calling groups_plugins_play to load vars for managed_node1 11579 1726882202.23439: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882202.25895: done with get_vars() 11579 1726882202.25917: done getting variables 11579 1726882202.26029: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) 
(found_in_cache=True, class_only=True) 11579 1726882202.26174: variable 'controller_device' from source: play vars TASK [Delete the device 'nm-bond'] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:114 Friday 20 September 2024 21:30:02 -0400 (0:00:00.458) 0:00:30.970 ****** 11579 1726882202.26206: entering _queue_task() for managed_node1/command 11579 1726882202.27022: worker is 1 (out of 1 available) 11579 1726882202.27116: exiting _queue_task() for managed_node1/command 11579 1726882202.27129: done queuing things up, now waiting for results queue to drain 11579 1726882202.27130: waiting for pending results... 11579 1726882202.27373: running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' 11579 1726882202.27579: in run() - task 12673a56-9f93-f197-7423-0000000000c1 11579 1726882202.27584: variable 'ansible_search_path' from source: unknown 11579 1726882202.27586: calling self._execute() 11579 1726882202.27669: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.27688: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.27708: variable 'omit' from source: magic vars 11579 1726882202.28101: variable 'ansible_distribution_major_version' from source: facts 11579 1726882202.28124: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882202.28136: variable 'omit' from source: magic vars 11579 1726882202.28162: variable 'omit' from source: magic vars 11579 1726882202.28265: variable 'controller_device' from source: play vars 11579 1726882202.28289: variable 'omit' from source: magic vars 11579 1726882202.28443: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882202.28447: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, 
class_only=False) 11579 1726882202.28450: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882202.28452: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882202.28454: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882202.28479: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882202.28488: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.28499: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.28610: Set connection var ansible_timeout to 10 11579 1726882202.28624: Set connection var ansible_shell_type to sh 11579 1726882202.28637: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882202.28647: Set connection var ansible_shell_executable to /bin/sh 11579 1726882202.28667: Set connection var ansible_pipelining to False 11579 1726882202.28674: Set connection var ansible_connection to ssh 11579 1726882202.28703: variable 'ansible_shell_executable' from source: unknown 11579 1726882202.28712: variable 'ansible_connection' from source: unknown 11579 1726882202.28751: variable 'ansible_module_compression' from source: unknown 11579 1726882202.28772: variable 'ansible_shell_type' from source: unknown 11579 1726882202.28779: variable 'ansible_shell_executable' from source: unknown 11579 1726882202.28786: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.28792: variable 'ansible_pipelining' from source: unknown 11579 1726882202.28802: variable 'ansible_timeout' from source: unknown 11579 1726882202.28810: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.29103: Loading ActionModule 'command' from 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882202.29108: variable 'omit' from source: magic vars 11579 1726882202.29110: starting attempt loop 11579 1726882202.29112: running the handler 11579 1726882202.29114: _low_level_execute_command(): starting 11579 1726882202.29115: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882202.29733: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.29756: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.29769: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.29786: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.29805: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882202.29815: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882202.29872: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.29926: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' <<< 11579 1726882202.29940: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.29970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.30113: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.31768: stdout chunk (state=3): >>>/root <<< 11579 1726882202.31903: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.31951: stderr chunk (state=3): >>><<< 11579 1726882202.31955: stdout chunk (state=3): >>><<< 11579 1726882202.32105: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.32109: _low_level_execute_command(): starting 11579 1726882202.32112: _low_level_execute_command(): executing: /bin/sh 
-c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345 `" && echo ansible-tmp-1726882202.3198528-13089-249059032343345="` echo /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345 `" ) && sleep 0' 11579 1726882202.32687: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.32707: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.32752: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.32861: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882202.32865: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.32883: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.32910: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.32924: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.33000: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.35041: stdout chunk (state=3): 
>>>ansible-tmp-1726882202.3198528-13089-249059032343345=/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345 <<< 11579 1726882202.35073: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.35076: stdout chunk (state=3): >>><<< 11579 1726882202.35079: stderr chunk (state=3): >>><<< 11579 1726882202.35105: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882202.3198528-13089-249059032343345=/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.35144: variable 'ansible_module_compression' from source: unknown 11579 1726882202.35279: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882202.35282: variable 'ansible_facts' 
from source: unknown 11579 1726882202.35354: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py 11579 1726882202.35514: Sending initial data 11579 1726882202.35528: Sent initial data (156 bytes) 11579 1726882202.36165: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.36282: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.36316: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.36391: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.37930: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882202.37953: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension 
"hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882202.38010: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882202.38064: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpc0rpxbji /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py <<< 11579 1726882202.38071: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py" <<< 11579 1726882202.38100: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpc0rpxbji" to remote "/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py" <<< 11579 1726882202.38829: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.38998: stderr chunk (state=3): >>><<< 11579 1726882202.39001: stdout chunk (state=3): >>><<< 11579 1726882202.39003: done transferring module to remote 11579 1726882202.39005: _low_level_execute_command(): starting 11579 1726882202.39008: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/ /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py 
&& sleep 0' 11579 1726882202.39692: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.39698: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.39700: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.39703: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.39705: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882202.39708: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882202.39709: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.39712: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882202.39714: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882202.39720: stderr chunk (state=3): >>>debug1: re-parsing configuration <<< 11579 1726882202.39722: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.39724: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.39726: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.39728: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882202.39730: stderr chunk (state=3): >>>debug2: match found <<< 11579 1726882202.39732: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.39734: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.39736: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK <<< 11579 1726882202.39872: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.39927: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.41818: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.41821: stdout chunk (state=3): >>><<< 11579 1726882202.41824: stderr chunk (state=3): >>><<< 11579 1726882202.41826: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.41829: _low_level_execute_command(): starting 11579 1726882202.41832: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/AnsiballZ_command.py && sleep 0' 11579 1726882202.42317: stderr chunk 
(state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.42329: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.42340: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.42411: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.42451: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.42464: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.42483: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.42563: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.58454: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:02.575386", "end": "2024-09-20 21:30:02.582847", "delta": "0:00:00.007461", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": 
true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882202.60009: stderr chunk (state=3): >>>debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. <<< 11579 1726882202.60013: stdout chunk (state=3): >>><<< 11579 1726882202.60015: stderr chunk (state=3): >>><<< 11579 1726882202.60037: _low_level_execute_command() done: rc=1, stdout= {"changed": true, "stdout": "", "stderr": "Cannot find device \"nm-bond\"", "rc": 1, "cmd": ["ip", "link", "del", "nm-bond"], "start": "2024-09-20 21:30:02.575386", "end": "2024-09-20 21:30:02.582847", "delta": "0:00:00.007461", "failed": true, "msg": "non-zero return code", "invocation": {"module_args": {"_raw_params": "ip link del nm-bond", "_uses_shell": false, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 1 Shared connection to 10.31.9.159 closed. 11579 1726882202.60081: done with _execute_module (ansible.legacy.command, {'_raw_params': 'ip link del nm-bond', '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882202.60110: _low_level_execute_command(): starting 11579 1726882202.60116: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882202.3198528-13089-249059032343345/ > /dev/null 2>&1 && sleep 0' 11579 1726882202.60773: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 
1726882202.60869: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.60878: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.61010: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.61063: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.63404: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.63408: stdout chunk (state=3): >>><<< 11579 1726882202.63411: stderr chunk (state=3): >>><<< 11579 1726882202.63414: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: 
mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.63416: handler run complete 11579 1726882202.63418: Evaluated conditional (False): False 11579 1726882202.63420: Evaluated conditional (False): False 11579 1726882202.63422: attempt loop complete, returning result 11579 1726882202.63424: _execute() done 11579 1726882202.63426: dumping result to json 11579 1726882202.63428: done dumping result, returning 11579 1726882202.63430: done running TaskExecutor() for managed_node1/TASK: Delete the device 'nm-bond' [12673a56-9f93-f197-7423-0000000000c1] 11579 1726882202.63432: sending task result for task 12673a56-9f93-f197-7423-0000000000c1 11579 1726882202.63509: done sending task result for task 12673a56-9f93-f197-7423-0000000000c1 11579 1726882202.63512: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": [ "ip", "link", "del", "nm-bond" ], "delta": "0:00:00.007461", "end": "2024-09-20 21:30:02.582847", "failed_when_result": false, "rc": 1, "start": "2024-09-20 21:30:02.575386" } STDERR: Cannot find device "nm-bond" MSG: non-zero return code 11579 1726882202.63578: no more pending results, returning what we have 11579 1726882202.63581: results queue empty 11579 1726882202.63582: checking for any_errors_fatal 11579 1726882202.63583: done checking for any_errors_fatal 11579 1726882202.63584: checking for max_fail_percentage 11579 1726882202.63586: done checking for max_fail_percentage 11579 1726882202.63587: checking to see if all hosts have failed and the running result is not ok 11579 1726882202.63588: done checking to see if all hosts have failed 11579 1726882202.63588: getting the remaining hosts for this loop 11579 1726882202.63590: done getting the remaining hosts for this loop 11579 1726882202.63592: getting the next task for host managed_node1 11579 1726882202.63603: done getting next task for host managed_node1 11579 1726882202.63605: 
^ task is: TASK: Remove test interfaces 11579 1726882202.63609: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? False 11579 1726882202.63613: getting variables 11579 1726882202.63615: in VariableManager get_vars() 11579 1726882202.63658: Calling all_inventory to load vars for managed_node1 11579 1726882202.63660: Calling groups_inventory to load vars for managed_node1 11579 1726882202.63662: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882202.63674: Calling all_plugins_play to load vars for managed_node1 11579 1726882202.63676: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882202.63679: Calling groups_plugins_play to load vars for managed_node1 11579 1726882202.67338: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882202.69440: done with get_vars() 11579 1726882202.69469: done getting variables 11579 1726882202.69539: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Remove test interfaces] ************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:3 Friday 20 September 2024 21:30:02 -0400 (0:00:00.433) 0:00:31.404 ****** 11579 1726882202.69583: entering _queue_task() for managed_node1/shell 11579 1726882202.69968: worker is 1 (out of 1 available) 11579 1726882202.69980: exiting _queue_task() for managed_node1/shell 11579 1726882202.70105: done queuing things up, now waiting for results queue to drain 11579 1726882202.70108: waiting for pending results... 11579 1726882202.70388: running TaskExecutor() for managed_node1/TASK: Remove test interfaces 11579 1726882202.70505: in run() - task 12673a56-9f93-f197-7423-0000000000c5 11579 1726882202.70563: variable 'ansible_search_path' from source: unknown 11579 1726882202.70592: variable 'ansible_search_path' from source: unknown 11579 1726882202.70635: calling self._execute() 11579 1726882202.70760: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.70773: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.70801: variable 'omit' from source: magic vars 11579 1726882202.71226: variable 'ansible_distribution_major_version' from source: facts 11579 1726882202.71235: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882202.71299: variable 'omit' from source: magic vars 11579 1726882202.71315: variable 'omit' from source: magic vars 11579 1726882202.71491: variable 'dhcp_interface1' from source: play vars 11579 1726882202.71507: variable 'dhcp_interface2' from source: play vars 11579 1726882202.71537: variable 'omit' from source: magic vars 11579 1726882202.71587: 
trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882202.71638: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882202.71662: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882202.71683: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882202.71740: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882202.71743: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882202.71745: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.71747: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.71838: Set connection var ansible_timeout to 10 11579 1726882202.71854: Set connection var ansible_shell_type to sh 11579 1726882202.71864: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882202.71872: Set connection var ansible_shell_executable to /bin/sh 11579 1726882202.71885: Set connection var ansible_pipelining to False 11579 1726882202.71890: Set connection var ansible_connection to ssh 11579 1726882202.71914: variable 'ansible_shell_executable' from source: unknown 11579 1726882202.71957: variable 'ansible_connection' from source: unknown 11579 1726882202.71959: variable 'ansible_module_compression' from source: unknown 11579 1726882202.71962: variable 'ansible_shell_type' from source: unknown 11579 1726882202.71964: variable 'ansible_shell_executable' from source: unknown 11579 1726882202.71965: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882202.71967: variable 'ansible_pipelining' from source: unknown 11579 1726882202.71969: variable 'ansible_timeout' 
from source: unknown 11579 1726882202.71971: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882202.72088: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882202.72110: variable 'omit' from source: magic vars 11579 1726882202.72119: starting attempt loop 11579 1726882202.72174: running the handler 11579 1726882202.72178: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882202.72181: _low_level_execute_command(): starting 11579 1726882202.72183: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882202.72880: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.72916: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.72944: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.72988: stderr chunk (state=3): >>>debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.73051: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.73072: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.73107: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.73181: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.74862: stdout chunk (state=3): >>>/root <<< 11579 1726882202.75106: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.75110: stdout chunk (state=3): >>><<< 11579 1726882202.75113: stderr chunk (state=3): >>><<< 11579 1726882202.75117: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' 
debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.75119: _low_level_execute_command(): starting 11579 1726882202.75122: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244 `" && echo ansible-tmp-1726882202.7502646-13120-27582711128244="` echo /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244 `" ) && sleep 0' 11579 1726882202.75710: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.75736: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.75762: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.75783: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882202.75879: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.75903: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.75935: stderr chunk (state=3): >>>debug2: fd 3 
setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.76007: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.77937: stdout chunk (state=3): >>>ansible-tmp-1726882202.7502646-13120-27582711128244=/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244 <<< 11579 1726882202.78070: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.78074: stdout chunk (state=3): >>><<< 11579 1726882202.78077: stderr chunk (state=3): >>><<< 11579 1726882202.78208: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882202.7502646-13120-27582711128244=/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.78212: variable 
'ansible_module_compression' from source: unknown 11579 1726882202.78214: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882202.78250: variable 'ansible_facts' from source: unknown 11579 1726882202.78347: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py 11579 1726882202.78576: Sending initial data 11579 1726882202.78580: Sent initial data (155 bytes) 11579 1726882202.79166: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.79181: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.79199: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.79321: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.79361: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.79402: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.81029: 
stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882202.81068: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882202.81124: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmp9vk_6vcj /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py <<< 11579 1726882202.81127: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py" <<< 11579 1726882202.81169: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmp9vk_6vcj" to remote "/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py" <<< 11579 1726882202.81964: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.81998: stderr chunk (state=3): >>><<< 11579 1726882202.82113: stdout chunk (state=3): >>><<< 11579 1726882202.82117: done transferring module to remote 11579 1726882202.82119: 
_low_level_execute_command(): starting 11579 1726882202.82122: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/ /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py && sleep 0' 11579 1726882202.82667: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.82680: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.82699: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.82750: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882202.82818: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882202.82837: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.82882: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.82931: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882202.84708: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882202.84751: stderr chunk 
(state=3): >>><<< 11579 1726882202.84754: stdout chunk (state=3): >>><<< 11579 1726882202.84842: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882202.84845: _low_level_execute_command(): starting 11579 1726882202.84848: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/AnsiballZ_command.py && sleep 0' 11579 1726882202.85341: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882202.85356: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.85370: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.85407: stderr chunk (state=3): >>>debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882202.85422: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882202.85504: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882202.85530: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882202.85606: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.05081: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:03.006224", "end": "2024-09-20 21:30:03.049216", "delta": "0:00:00.042992", "msg": "", "invocation": {"module_args": 
{"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882203.06714: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.06900: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 11579 1726882203.06904: stdout chunk (state=3): >>><<< 11579 1726882203.06906: stderr chunk (state=3): >>><<< 11579 1726882203.06909: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ rc=0\n+ ip link delete test1\n+ '[' 0 '!=' 0 ']'\n+ ip link delete test2\n+ '[' 0 '!=' 0 ']'\n+ ip link delete testbr\n+ '[' 0 '!=' 0 ']'", "rc": 0, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "start": "2024-09-20 21:30:03.006224", "end": "2024-09-20 21:30:03.049216", "delta": "0:00:00.042992", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || 
rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882203.06917: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test1 - error "$rc"\nfi\nip link delete test2 || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link test2 - error "$rc"\nfi\nip link delete testbr || rc="$?"\nif [ "$rc" != 0 ]; then\n echo ERROR - could not delete link testbr - error "$rc"\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882203.06919: _low_level_execute_command(): starting 11579 1726882203.06921: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882202.7502646-13120-27582711128244/ > /dev/null 2>&1 && sleep 0' 11579 1726882203.07437: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.07475: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.07478: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882203.07481: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882203.07483: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 
originally 10.31.9.159 <<< 11579 1726882203.07485: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882203.07499: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.07555: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882203.07558: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882203.07644: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.07647: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.07691: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.09546: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.09557: stderr chunk (state=3): >>><<< 11579 1726882203.09564: stdout chunk (state=3): >>><<< 11579 1726882203.09582: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.09592: handler run complete 11579 1726882203.09800: Evaluated conditional (False): False 11579 1726882203.09804: attempt loop complete, returning result 11579 1726882203.09809: _execute() done 11579 1726882203.09812: dumping result to json 11579 1726882203.09814: done dumping result, returning 11579 1726882203.09816: done running TaskExecutor() for managed_node1/TASK: Remove test interfaces [12673a56-9f93-f197-7423-0000000000c5] 11579 1726882203.09817: sending task result for task 12673a56-9f93-f197-7423-0000000000c5 11579 1726882203.09891: done sending task result for task 12673a56-9f93-f197-7423-0000000000c5 11579 1726882203.09906: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euxo pipefail\nexec 1>&2\nrc=0\nip link delete test1 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test1 - error \"$rc\"\nfi\nip link delete test2 || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link test2 - error \"$rc\"\nfi\nip link delete testbr || rc=\"$?\"\nif [ \"$rc\" != 0 ]; then\n echo ERROR - could not delete link testbr - error \"$rc\"\nfi\n", "delta": "0:00:00.042992", "end": "2024-09-20 21:30:03.049216", "rc": 0, "start": "2024-09-20 21:30:03.006224" } STDERR: + exec + rc=0 + ip 
link delete test1 + '[' 0 '!=' 0 ']' + ip link delete test2 + '[' 0 '!=' 0 ']' + ip link delete testbr + '[' 0 '!=' 0 ']' 11579 1726882203.09973: no more pending results, returning what we have 11579 1726882203.09975: results queue empty 11579 1726882203.09976: checking for any_errors_fatal 11579 1726882203.09984: done checking for any_errors_fatal 11579 1726882203.09985: checking for max_fail_percentage 11579 1726882203.09987: done checking for max_fail_percentage 11579 1726882203.09987: checking to see if all hosts have failed and the running result is not ok 11579 1726882203.09988: done checking to see if all hosts have failed 11579 1726882203.09989: getting the remaining hosts for this loop 11579 1726882203.09990: done getting the remaining hosts for this loop 11579 1726882203.09997: getting the next task for host managed_node1 11579 1726882203.10004: done getting next task for host managed_node1 11579 1726882203.10006: ^ task is: TASK: Stop dnsmasq/radvd services 11579 1726882203.10009: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=4, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882203.10013: getting variables 11579 1726882203.10015: in VariableManager get_vars() 11579 1726882203.10050: Calling all_inventory to load vars for managed_node1 11579 1726882203.10053: Calling groups_inventory to load vars for managed_node1 11579 1726882203.10059: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882203.10068: Calling all_plugins_play to load vars for managed_node1 11579 1726882203.10070: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882203.10073: Calling groups_plugins_play to load vars for managed_node1 11579 1726882203.11374: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882203.12869: done with get_vars() 11579 1726882203.12899: done getting variables 11579 1726882203.12962: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Stop dnsmasq/radvd services] ********************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/remove_test_interfaces_with_dhcp.yml:23 Friday 20 September 2024 21:30:03 -0400 (0:00:00.434) 0:00:31.838 ****** 11579 1726882203.13001: entering _queue_task() for managed_node1/shell 11579 1726882203.13369: worker is 1 (out of 1 available) 11579 1726882203.13383: exiting _queue_task() for managed_node1/shell 11579 1726882203.13596: done queuing things up, now waiting for results queue to drain 11579 1726882203.13599: waiting for pending results... 
11579 1726882203.13689: running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services 11579 1726882203.13841: in run() - task 12673a56-9f93-f197-7423-0000000000c6 11579 1726882203.13864: variable 'ansible_search_path' from source: unknown 11579 1726882203.13872: variable 'ansible_search_path' from source: unknown 11579 1726882203.13916: calling self._execute() 11579 1726882203.14021: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.14035: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.14059: variable 'omit' from source: magic vars 11579 1726882203.14454: variable 'ansible_distribution_major_version' from source: facts 11579 1726882203.14482: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882203.14486: variable 'omit' from source: magic vars 11579 1726882203.14592: variable 'omit' from source: magic vars 11579 1726882203.14596: variable 'omit' from source: magic vars 11579 1726882203.14634: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882203.14675: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882203.14707: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882203.14730: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882203.14748: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882203.14784: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882203.14808: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.14811: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 
1726882203.14909: Set connection var ansible_timeout to 10 11579 1726882203.15099: Set connection var ansible_shell_type to sh 11579 1726882203.15102: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882203.15104: Set connection var ansible_shell_executable to /bin/sh 11579 1726882203.15107: Set connection var ansible_pipelining to False 11579 1726882203.15109: Set connection var ansible_connection to ssh 11579 1726882203.15111: variable 'ansible_shell_executable' from source: unknown 11579 1726882203.15113: variable 'ansible_connection' from source: unknown 11579 1726882203.15115: variable 'ansible_module_compression' from source: unknown 11579 1726882203.15116: variable 'ansible_shell_type' from source: unknown 11579 1726882203.15118: variable 'ansible_shell_executable' from source: unknown 11579 1726882203.15120: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.15122: variable 'ansible_pipelining' from source: unknown 11579 1726882203.15123: variable 'ansible_timeout' from source: unknown 11579 1726882203.15125: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.15161: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882203.15178: variable 'omit' from source: magic vars 11579 1726882203.15189: starting attempt loop 11579 1726882203.15199: running the handler 11579 1726882203.15215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882203.15247: 
_low_level_execute_command(): starting 11579 1726882203.15261: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882203.15965: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.16008: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.16021: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882203.16037: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882203.16120: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.16134: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.16151: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.16182: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.16251: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.17909: stdout chunk (state=3): >>>/root <<< 11579 1726882203.18060: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.18064: stdout chunk (state=3): >>><<< 11579 1726882203.18066: stderr 
chunk (state=3): >>><<< 11579 1726882203.18099: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.18126: _low_level_execute_command(): starting 11579 1726882203.18135: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960 `" && echo ansible-tmp-1726882203.181133-13146-129200503899960="` echo /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960 `" ) && sleep 0' 11579 1726882203.18790: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.18809: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.18835: stderr chunk (state=3): >>>debug1: Reading configuration 
data /etc/ssh/ssh_config <<< 11579 1726882203.18856: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882203.18962: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.18983: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.19004: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.19076: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.20966: stdout chunk (state=3): >>>ansible-tmp-1726882203.181133-13146-129200503899960=/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960 <<< 11579 1726882203.21116: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.21126: stdout chunk (state=3): >>><<< 11579 1726882203.21163: stderr chunk (state=3): >>><<< 11579 1726882203.21181: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882203.181133-13146-129200503899960=/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data 
/etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.21403: variable 'ansible_module_compression' from source: unknown 11579 1726882203.21407: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882203.21409: variable 'ansible_facts' from source: unknown 11579 1726882203.21601: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py 11579 1726882203.22032: Sending initial data 11579 1726882203.22036: Sent initial data (155 bytes) 11579 1726882203.23056: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.23109: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 
debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.23173: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.23199: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.23236: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.23280: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.24879: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882203.24883: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882203.24923: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpnenl3jw2 /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py <<< 11579 1726882203.24945: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py" <<< 11579 1726882203.24966: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpnenl3jw2" to remote "/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py" <<< 11579 1726882203.24974: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py" <<< 11579 1726882203.25958: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.26023: stderr chunk (state=3): >>><<< 11579 1726882203.26029: stdout chunk (state=3): >>><<< 11579 1726882203.26051: done transferring module to remote 11579 1726882203.26098: _low_level_execute_command(): starting 11579 1726882203.26101: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/ /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py && sleep 0' 11579 1726882203.26713: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: 
resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.26750: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.26760: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.26776: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.26860: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.28787: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.28790: stdout chunk (state=3): >>><<< 11579 1726882203.28792: stderr chunk (state=3): >>><<< 11579 1726882203.28815: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match 
found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.28825: _low_level_execute_command(): starting 11579 1726882203.28905: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/AnsiballZ_command.py && sleep 0' 11579 1726882203.29519: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.29523: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882203.29525: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882203.29528: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found <<< 11579 1726882203.29531: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.29617: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 
4 <<< 11579 1726882203.29664: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.47459: stdout chunk (state=3): >>> {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:30:03.445617", "end": "2024-09-20 21:30:03.472811", "delta": "0:00:00.027194", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882203.49091: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882203.49097: stdout chunk (state=3): >>><<< 11579 1726882203.49100: stderr chunk (state=3): >>><<< 11579 1726882203.49124: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "", "stderr": "+ exec\n+ pkill -F /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.pid\n+ rm -rf /run/dhcp_testbr.lease\n+ grep 'release 6' /etc/redhat-release\n+ systemctl is-active firewalld\ninactive", "rc": 0, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "start": "2024-09-20 21:30:03.445617", "end": "2024-09-20 21:30:03.472811", "delta": "0:00:00.027194", "msg": "", "invocation": {"module_args": {"_raw_params": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882203.49176: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep \'release 6\' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service="$service"; then\n firewall-cmd --remove-service "$service"\n fi\n done\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882203.49277: _low_level_execute_command(): starting 11579 1726882203.49281: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882203.181133-13146-129200503899960/ > /dev/null 2>&1 && sleep 0' 11579 1726882203.49911: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config 
debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.49930: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.49954: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.49970: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.50115: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.51986: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.52301: stdout chunk (state=3): >>><<< 11579 1726882203.52305: stderr chunk (state=3): >>><<< 11579 1726882203.52307: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' 
host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.52309: handler run complete 11579 1726882203.52311: Evaluated conditional (False): False 11579 1726882203.52313: attempt loop complete, returning result 11579 1726882203.52315: _execute() done 11579 1726882203.52316: dumping result to json 11579 1726882203.52318: done dumping result, returning 11579 1726882203.52320: done running TaskExecutor() for managed_node1/TASK: Stop dnsmasq/radvd services [12673a56-9f93-f197-7423-0000000000c6] 11579 1726882203.52329: sending task result for task 12673a56-9f93-f197-7423-0000000000c6 11579 1726882203.52406: done sending task result for task 12673a56-9f93-f197-7423-0000000000c6 11579 1726882203.52409: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -uxo pipefail\nexec 1>&2\npkill -F /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.pid\nrm -rf /run/dhcp_testbr.lease\nif grep 'release 6' /etc/redhat-release; then\n # Stop radvd server\n service radvd stop\n iptables -D INPUT -i testbr -p udp --dport 67:68 --sport 67:68 -j ACCEPT\nfi\nif systemctl is-active firewalld; then\n for service in dhcp dhcpv6 dhcpv6-client; do\n if firewall-cmd --query-service=\"$service\"; then\n firewall-cmd --remove-service \"$service\"\n fi\n done\nfi\n", "delta": "0:00:00.027194", "end": "2024-09-20 21:30:03.472811", "rc": 0, "start": "2024-09-20 21:30:03.445617" } STDERR: + exec + pkill -F /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.pid + rm -rf /run/dhcp_testbr.lease + grep 'release 6' /etc/redhat-release + systemctl is-active firewalld inactive 11579 1726882203.52495: no more pending results, returning what 
we have 11579 1726882203.52499: results queue empty 11579 1726882203.52500: checking for any_errors_fatal 11579 1726882203.52511: done checking for any_errors_fatal 11579 1726882203.52512: checking for max_fail_percentage 11579 1726882203.52515: done checking for max_fail_percentage 11579 1726882203.52516: checking to see if all hosts have failed and the running result is not ok 11579 1726882203.52517: done checking to see if all hosts have failed 11579 1726882203.52518: getting the remaining hosts for this loop 11579 1726882203.52519: done getting the remaining hosts for this loop 11579 1726882203.52522: getting the next task for host managed_node1 11579 1726882203.52532: done getting next task for host managed_node1 11579 1726882203.52534: ^ task is: TASK: Restore the /etc/resolv.conf for initscript 11579 1726882203.52537: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=5, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882203.52541: getting variables 11579 1726882203.52544: in VariableManager get_vars() 11579 1726882203.52589: Calling all_inventory to load vars for managed_node1 11579 1726882203.52592: Calling groups_inventory to load vars for managed_node1 11579 1726882203.52710: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882203.52722: Calling all_plugins_play to load vars for managed_node1 11579 1726882203.52725: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882203.52728: Calling groups_plugins_play to load vars for managed_node1 11579 1726882203.54473: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882203.56841: done with get_vars() 11579 1726882203.56866: done getting variables 11579 1726882203.56960: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Restore the /etc/resolv.conf for initscript] ***************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:120 Friday 20 September 2024 21:30:03 -0400 (0:00:00.440) 0:00:32.278 ****** 11579 1726882203.57018: entering _queue_task() for managed_node1/command 11579 1726882203.57476: worker is 1 (out of 1 available) 11579 1726882203.57492: exiting _queue_task() for managed_node1/command 11579 1726882203.57509: done queuing things up, now waiting for results queue to drain 11579 1726882203.57511: waiting for pending results... 
11579 1726882203.57917: running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript 11579 1726882203.58040: in run() - task 12673a56-9f93-f197-7423-0000000000c7 11579 1726882203.58084: variable 'ansible_search_path' from source: unknown 11579 1726882203.58117: calling self._execute() 11579 1726882203.58233: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.58400: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.58403: variable 'omit' from source: magic vars 11579 1726882203.58643: variable 'ansible_distribution_major_version' from source: facts 11579 1726882203.58659: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882203.58787: variable 'network_provider' from source: set_fact 11579 1726882203.58802: Evaluated conditional (network_provider == "initscripts"): False 11579 1726882203.58811: when evaluation is False, skipping this task 11579 1726882203.58819: _execute() done 11579 1726882203.58828: dumping result to json 11579 1726882203.58839: done dumping result, returning 11579 1726882203.58852: done running TaskExecutor() for managed_node1/TASK: Restore the /etc/resolv.conf for initscript [12673a56-9f93-f197-7423-0000000000c7] 11579 1726882203.58861: sending task result for task 12673a56-9f93-f197-7423-0000000000c7 11579 1726882203.59115: done sending task result for task 12673a56-9f93-f197-7423-0000000000c7 11579 1726882203.59119: WORKER PROCESS EXITING skipping: [managed_node1] => { "changed": false, "false_condition": "network_provider == \"initscripts\"", "skip_reason": "Conditional result was False" } 11579 1726882203.59167: no more pending results, returning what we have 11579 1726882203.59172: results queue empty 11579 1726882203.59173: checking for any_errors_fatal 11579 1726882203.59183: done checking for any_errors_fatal 11579 1726882203.59184: checking for max_fail_percentage 11579 1726882203.59186: done 
checking for max_fail_percentage 11579 1726882203.59186: checking to see if all hosts have failed and the running result is not ok 11579 1726882203.59188: done checking to see if all hosts have failed 11579 1726882203.59188: getting the remaining hosts for this loop 11579 1726882203.59190: done getting the remaining hosts for this loop 11579 1726882203.59198: getting the next task for host managed_node1 11579 1726882203.59206: done getting next task for host managed_node1 11579 1726882203.59209: ^ task is: TASK: Verify network state restored to default 11579 1726882203.59213: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=6, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882203.59218: getting variables 11579 1726882203.59220: in VariableManager get_vars() 11579 1726882203.59265: Calling all_inventory to load vars for managed_node1 11579 1726882203.59268: Calling groups_inventory to load vars for managed_node1 11579 1726882203.59271: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882203.59285: Calling all_plugins_play to load vars for managed_node1 11579 1726882203.59288: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882203.59292: Calling groups_plugins_play to load vars for managed_node1 11579 1726882203.60746: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882203.62906: done with get_vars() 11579 1726882203.62927: done getting variables TASK [Verify network state restored to default] ******************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:125 Friday 20 September 2024 21:30:03 -0400 (0:00:00.060) 0:00:32.338 ****** 11579 1726882203.63027: entering _queue_task() for managed_node1/include_tasks 11579 1726882203.63358: worker is 1 (out of 1 available) 11579 1726882203.63371: exiting _queue_task() for managed_node1/include_tasks 11579 1726882203.63383: done queuing things up, now waiting for results queue to drain 11579 1726882203.63384: waiting for pending results... 
11579 1726882203.63655: running TaskExecutor() for managed_node1/TASK: Verify network state restored to default 11579 1726882203.63777: in run() - task 12673a56-9f93-f197-7423-0000000000c8 11579 1726882203.63802: variable 'ansible_search_path' from source: unknown 11579 1726882203.63846: calling self._execute() 11579 1726882203.63951: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.63962: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.63977: variable 'omit' from source: magic vars 11579 1726882203.64350: variable 'ansible_distribution_major_version' from source: facts 11579 1726882203.64370: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882203.64602: _execute() done 11579 1726882203.64606: dumping result to json 11579 1726882203.64608: done dumping result, returning 11579 1726882203.64611: done running TaskExecutor() for managed_node1/TASK: Verify network state restored to default [12673a56-9f93-f197-7423-0000000000c8] 11579 1726882203.64613: sending task result for task 12673a56-9f93-f197-7423-0000000000c8 11579 1726882203.64686: done sending task result for task 12673a56-9f93-f197-7423-0000000000c8 11579 1726882203.64689: WORKER PROCESS EXITING 11579 1726882203.64722: no more pending results, returning what we have 11579 1726882203.64727: in VariableManager get_vars() 11579 1726882203.64776: Calling all_inventory to load vars for managed_node1 11579 1726882203.64779: Calling groups_inventory to load vars for managed_node1 11579 1726882203.64781: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882203.64800: Calling all_plugins_play to load vars for managed_node1 11579 1726882203.64804: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882203.64811: Calling groups_plugins_play to load vars for managed_node1 11579 1726882203.66538: 
'/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882203.68139: done with get_vars() 11579 1726882203.68156: variable 'ansible_search_path' from source: unknown 11579 1726882203.68169: we have included files to process 11579 1726882203.68170: generating all_blocks data 11579 1726882203.68173: done generating all_blocks data 11579 1726882203.68178: processing included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11579 1726882203.68179: loading included file: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11579 1726882203.68181: Loading data from /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml 11579 1726882203.68558: done processing included file 11579 1726882203.68559: iterating over new_blocks loaded from include file 11579 1726882203.68561: in VariableManager get_vars() 11579 1726882203.68577: done with get_vars() 11579 1726882203.68579: filtering new block on tags 11579 1726882203.68612: done filtering new block on tags 11579 1726882203.68614: done iterating over new_blocks loaded from include file included: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml for managed_node1 11579 1726882203.68619: extending task lists for all hosts with included blocks 11579 1726882203.70829: done extending task lists 11579 1726882203.70830: done processing included files 11579 1726882203.70831: results queue empty 11579 1726882203.70832: checking for any_errors_fatal 11579 1726882203.70835: done checking for any_errors_fatal 11579 1726882203.70836: checking for max_fail_percentage 11579 1726882203.70837: done checking for max_fail_percentage 11579 1726882203.70838: checking to see if all hosts have failed and the running 
result is not ok 11579 1726882203.70839: done checking to see if all hosts have failed 11579 1726882203.70839: getting the remaining hosts for this loop 11579 1726882203.70841: done getting the remaining hosts for this loop 11579 1726882203.70843: getting the next task for host managed_node1 11579 1726882203.70847: done getting next task for host managed_node1 11579 1726882203.70849: ^ task is: TASK: Check routes and DNS 11579 1726882203.70852: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882203.70855: getting variables 11579 1726882203.70855: in VariableManager get_vars() 11579 1726882203.70870: Calling all_inventory to load vars for managed_node1 11579 1726882203.70872: Calling groups_inventory to load vars for managed_node1 11579 1726882203.70874: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882203.70879: Calling all_plugins_play to load vars for managed_node1 11579 1726882203.70881: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882203.70884: Calling groups_plugins_play to load vars for managed_node1 11579 1726882203.72259: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882203.74754: done with get_vars() 11579 1726882203.74776: done getting variables 11579 1726882203.74825: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Check routes and DNS] **************************************************** task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:6 Friday 20 September 2024 21:30:03 -0400 (0:00:00.118) 0:00:32.457 ****** 11579 1726882203.74865: entering _queue_task() for managed_node1/shell 11579 1726882203.75253: worker is 1 (out of 1 available) 11579 1726882203.75381: exiting _queue_task() for managed_node1/shell 11579 1726882203.75392: done queuing things up, now waiting for results queue to drain 11579 1726882203.75395: waiting for pending results... 
11579 1726882203.75813: running TaskExecutor() for managed_node1/TASK: Check routes and DNS 11579 1726882203.75818: in run() - task 12673a56-9f93-f197-7423-00000000056d 11579 1726882203.75822: variable 'ansible_search_path' from source: unknown 11579 1726882203.75824: variable 'ansible_search_path' from source: unknown 11579 1726882203.75827: calling self._execute() 11579 1726882203.76001: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.76005: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.76009: variable 'omit' from source: magic vars 11579 1726882203.76335: variable 'ansible_distribution_major_version' from source: facts 11579 1726882203.76399: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882203.76403: variable 'omit' from source: magic vars 11579 1726882203.76447: variable 'omit' from source: magic vars 11579 1726882203.76706: variable 'omit' from source: magic vars 11579 1726882203.76710: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882203.76714: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882203.76716: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882203.76719: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882203.76721: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882203.76723: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882203.76725: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.76727: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.76889: 
Set connection var ansible_timeout to 10 11579 1726882203.76899: Set connection var ansible_shell_type to sh 11579 1726882203.76907: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882203.76922: Set connection var ansible_shell_executable to /bin/sh 11579 1726882203.76929: Set connection var ansible_pipelining to False 11579 1726882203.76932: Set connection var ansible_connection to ssh 11579 1726882203.76958: variable 'ansible_shell_executable' from source: unknown 11579 1726882203.76962: variable 'ansible_connection' from source: unknown 11579 1726882203.76965: variable 'ansible_module_compression' from source: unknown 11579 1726882203.76967: variable 'ansible_shell_type' from source: unknown 11579 1726882203.76970: variable 'ansible_shell_executable' from source: unknown 11579 1726882203.76972: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882203.76974: variable 'ansible_pipelining' from source: unknown 11579 1726882203.76977: variable 'ansible_timeout' from source: unknown 11579 1726882203.76981: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882203.77148: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882203.77206: variable 'omit' from source: magic vars 11579 1726882203.77209: starting attempt loop 11579 1726882203.77212: running the handler 11579 1726882203.77215: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882203.77218: 
_low_level_execute_command(): starting 11579 1726882203.77221: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882203.78450: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.78776: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.78879: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.80521: stdout chunk (state=3): >>>/root <<< 11579 1726882203.80636: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.80639: stderr chunk (state=3): >>><<< 11579 1726882203.80642: stdout chunk (state=3): >>><<< 11579 1726882203.80670: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf 
debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.80684: _low_level_execute_command(): starting 11579 1726882203.80690: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975 `" && echo ansible-tmp-1726882203.8067062-13186-136635350755975="` echo /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975 `" ) && sleep 0' 11579 1726882203.81327: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.81337: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.81348: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882203.81363: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882203.81489: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading 
configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.81511: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.81515: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.81532: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.81597: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.83498: stdout chunk (state=3): >>>ansible-tmp-1726882203.8067062-13186-136635350755975=/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975 <<< 11579 1726882203.83670: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.83673: stdout chunk (state=3): >>><<< 11579 1726882203.83675: stderr chunk (state=3): >>><<< 11579 1726882203.83692: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882203.8067062-13186-136635350755975=/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.84001: variable 'ansible_module_compression' from source: unknown 11579 1726882203.84005: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882203.84007: variable 'ansible_facts' from source: unknown 11579 1726882203.84009: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py 11579 1726882203.84126: Sending initial data 11579 1726882203.84129: Sent initial data (156 bytes) 11579 1726882203.84809: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data 
/root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.84835: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.84846: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.84865: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.84928: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.86489: stderr chunk (state=3): >>>debug2: Remote version: 3 debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension "lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882203.86535: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." 
<<< 11579 1726882203.86599: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpd8bcb60e /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py <<< 11579 1726882203.86602: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py" <<< 11579 1726882203.86650: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpd8bcb60e" to remote "/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py" <<< 11579 1726882203.87448: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.87452: stderr chunk (state=3): >>><<< 11579 1726882203.87454: stdout chunk (state=3): >>><<< 11579 1726882203.87456: done transferring module to remote 11579 1726882203.87458: _low_level_execute_command(): starting 11579 1726882203.87469: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/ /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py && sleep 0' 11579 1726882203.88131: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.88147: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.88163: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882203.88214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data 
/etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.88282: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.88316: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.88345: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.88449: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882203.90248: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882203.90256: stdout chunk (state=3): >>><<< 11579 1726882203.90259: stderr chunk (state=3): >>><<< 11579 1726882203.90355: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data 
/etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882203.90358: _low_level_execute_command(): starting 11579 1726882203.90361: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/AnsiballZ_command.py && sleep 0' 11579 1726882203.90905: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882203.90920: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882203.91012: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882203.91036: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882203.91051: stderr chunk (state=3): 
>>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882203.91071: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882203.91145: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.07240: stdout chunk (state=3): >>> {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2982sec preferred_lft 2982sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:30:04.060465", "end": "2024-09-20 21:30:04.069132", "delta": "0:00:00.008667", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || 
:\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882204.08624: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.08671: stderr chunk (state=3): >>>Shared connection to 10.31.9.159 closed. <<< 11579 1726882204.08716: stderr chunk (state=3): >>><<< 11579 1726882204.08775: stdout chunk (state=3): >>><<< 11579 1726882204.08805: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "IP\n1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000\n link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00\n inet 127.0.0.1/8 scope host lo\n valid_lft forever preferred_lft forever\n inet6 ::1/128 scope host noprefixroute \n valid_lft forever preferred_lft forever\n2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000\n link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff\n altname enX0\n inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0\n valid_lft 2982sec preferred_lft 2982sec\n inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute \n valid_lft forever preferred_lft forever\nIP ROUTE\ndefault via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 \n10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 \nIP -6 ROUTE\nfe80::/64 dev eth0 proto kernel metric 1024 pref medium\nRESOLV\n# Generated by NetworkManager\nsearch us-east-1.aws.redhat.com\nnameserver 10.29.169.13\nnameserver 10.29.170.12\nnameserver 10.2.32.1", "stderr": "", "rc": 0, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "start": "2024-09-20 21:30:04.060465", "end": "2024-09-20 21:30:04.069132", "delta": 
"0:00:00.008667", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
11579 1726882204.08861: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882204.08885: _low_level_execute_command(): starting 11579 1726882204.08897: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882203.8067062-13186-136635350755975/ > /dev/null 2>&1 && sleep 0' 11579 1726882204.09498: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882204.09617: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: 
Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.09621: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.09869: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.11613: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.11627: stderr chunk (state=3): >>><<< 11579 1726882204.11630: stdout chunk (state=3): >>><<< 11579 1726882204.11651: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882204.11654: handler run complete 11579 1726882204.11678: Evaluated conditional (False): False 11579 1726882204.11687: attempt loop complete, returning result 11579 1726882204.11690: _execute() done 11579 1726882204.11692: dumping result to json 11579 1726882204.11733: done dumping result, returning 11579 1726882204.11746: done running TaskExecutor() for managed_node1/TASK: Check routes and DNS [12673a56-9f93-f197-7423-00000000056d] 11579 1726882204.11795: sending task result for task 12673a56-9f93-f197-7423-00000000056d ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho IP\nip a\necho IP ROUTE\nip route\necho IP -6 ROUTE\nip -6 route\necho RESOLV\nif [ -f /etc/resolv.conf ]; then\n cat /etc/resolv.conf\nelse\n echo NO /etc/resolv.conf\n ls -alrtF /etc/resolv.* || :\nfi\n", "delta": "0:00:00.008667", "end": "2024-09-20 21:30:04.069132", "rc": 0, "start": "2024-09-20 21:30:04.060465" } STDOUT: IP 1: lo: mtu 65536 qdisc noqueue state UNKNOWN group default qlen 1000 link/loopback 00:00:00:00:00:00 brd 00:00:00:00:00:00 inet 127.0.0.1/8 scope host lo valid_lft forever preferred_lft forever inet6 ::1/128 scope host noprefixroute valid_lft forever preferred_lft forever 2: eth0: mtu 9001 qdisc mq state UP group default qlen 1000 link/ether 12:30:0b:a1:42:23 brd ff:ff:ff:ff:ff:ff altname enX0 inet 10.31.9.159/22 brd 10.31.11.255 scope global dynamic noprefixroute eth0 valid_lft 2982sec preferred_lft 2982sec inet6 fe80::1030:bff:fea1:4223/64 scope link noprefixroute valid_lft forever preferred_lft forever IP ROUTE default via 10.31.8.1 dev eth0 proto dhcp src 10.31.9.159 metric 100 10.31.8.0/22 dev eth0 proto kernel scope link src 10.31.9.159 metric 100 IP -6 ROUTE fe80::/64 dev eth0 proto kernel metric 1024 pref medium RESOLV # Generated by NetworkManager search us-east-1.aws.redhat.com nameserver 10.29.169.13 nameserver 10.29.170.12 nameserver 
10.2.32.1 11579 1726882204.12040: no more pending results, returning what we have 11579 1726882204.12044: results queue empty 11579 1726882204.12045: checking for any_errors_fatal 11579 1726882204.12046: done checking for any_errors_fatal 11579 1726882204.12046: checking for max_fail_percentage 11579 1726882204.12048: done checking for max_fail_percentage 11579 1726882204.12049: checking to see if all hosts have failed and the running result is not ok 11579 1726882204.12050: done checking to see if all hosts have failed 11579 1726882204.12050: getting the remaining hosts for this loop 11579 1726882204.12052: done getting the remaining hosts for this loop 11579 1726882204.12055: getting the next task for host managed_node1 11579 1726882204.12061: done getting next task for host managed_node1 11579 1726882204.12063: ^ task is: TASK: Verify DNS and network connectivity 11579 1726882204.12066: ^ state is: HOST STATE: block=2, task=16, rescue=0, always=1, handlers=0, run_state=3, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (HOST STATE: block=0, task=7, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (HOST STATE: block=0, task=2, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=None, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False), did rescue? False, did start at task? 
False 11579 1726882204.12070: getting variables 11579 1726882204.12071: in VariableManager get_vars() 11579 1726882204.12109: Calling all_inventory to load vars for managed_node1 11579 1726882204.12112: Calling groups_inventory to load vars for managed_node1 11579 1726882204.12118: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882204.12127: Calling all_plugins_play to load vars for managed_node1 11579 1726882204.12129: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882204.12132: Calling groups_plugins_play to load vars for managed_node1 11579 1726882204.12700: done sending task result for task 12673a56-9f93-f197-7423-00000000056d 11579 1726882204.12704: WORKER PROCESS EXITING 11579 1726882204.13534: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882204.15497: done with get_vars() 11579 1726882204.15528: done getting variables 11579 1726882204.15585: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=True) TASK [Verify DNS and network connectivity] ************************************* task path: /tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24 Friday 20 September 2024 21:30:04 -0400 (0:00:00.407) 0:00:32.864 ****** 11579 1726882204.15627: entering _queue_task() for managed_node1/shell 11579 1726882204.16081: worker is 1 (out of 1 available) 11579 1726882204.16094: exiting _queue_task() for managed_node1/shell 11579 1726882204.16106: done queuing things up, now waiting for results queue to drain 11579 1726882204.16108: waiting for pending results... 
11579 1726882204.16324: running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity 11579 1726882204.16469: in run() - task 12673a56-9f93-f197-7423-00000000056e 11579 1726882204.16491: variable 'ansible_search_path' from source: unknown 11579 1726882204.16510: variable 'ansible_search_path' from source: unknown 11579 1726882204.16554: calling self._execute() 11579 1726882204.16719: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882204.16723: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882204.16726: variable 'omit' from source: magic vars 11579 1726882204.17111: variable 'ansible_distribution_major_version' from source: facts 11579 1726882204.17130: Evaluated conditional (ansible_distribution_major_version != '6'): True 11579 1726882204.17282: variable 'ansible_facts' from source: unknown 11579 1726882204.18073: Evaluated conditional (ansible_facts["distribution"] == "CentOS"): True 11579 1726882204.18088: variable 'omit' from source: magic vars 11579 1726882204.18143: variable 'omit' from source: magic vars 11579 1726882204.18189: variable 'omit' from source: magic vars 11579 1726882204.18243: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/connection 11579 1726882204.18276: Loading Connection 'ssh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/connection/ssh.py (found_in_cache=True, class_only=False) 11579 1726882204.18352: trying /usr/local/lib/python3.12/site-packages/ansible/plugins/shell 11579 1726882204.18356: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882204.18358: Loading ShellModule 'sh' from /usr/local/lib/python3.12/site-packages/ansible/plugins/shell/sh.py (found_in_cache=True, class_only=False) 11579 1726882204.18382: variable 'inventory_hostname' from source: host vars for 'managed_node1' 11579 1726882204.18390: variable 
'ansible_host' from source: host vars for 'managed_node1' 11579 1726882204.18404: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882204.18543: Set connection var ansible_timeout to 10 11579 1726882204.18556: Set connection var ansible_shell_type to sh 11579 1726882204.18575: Set connection var ansible_module_compression to ZIP_DEFLATED 11579 1726882204.18603: Set connection var ansible_shell_executable to /bin/sh 11579 1726882204.18606: Set connection var ansible_pipelining to False 11579 1726882204.18608: Set connection var ansible_connection to ssh 11579 1726882204.18627: variable 'ansible_shell_executable' from source: unknown 11579 1726882204.18678: variable 'ansible_connection' from source: unknown 11579 1726882204.18681: variable 'ansible_module_compression' from source: unknown 11579 1726882204.18682: variable 'ansible_shell_type' from source: unknown 11579 1726882204.18684: variable 'ansible_shell_executable' from source: unknown 11579 1726882204.18686: variable 'ansible_host' from source: host vars for 'managed_node1' 11579 1726882204.18688: variable 'ansible_pipelining' from source: unknown 11579 1726882204.18689: variable 'ansible_timeout' from source: unknown 11579 1726882204.18691: variable 'ansible_ssh_extra_args' from source: host vars for 'managed_node1' 11579 1726882204.18800: Loading ActionModule 'shell' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/shell.py (searched paths: /usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882204.18814: variable 'omit' from source: magic vars 11579 1726882204.18823: starting attempt loop 11579 1726882204.18828: running the handler 11579 1726882204.18842: Loading ActionModule 'command' from /usr/local/lib/python3.12/site-packages/ansible/plugins/action/command.py (searched paths: 
/usr/local/lib/python3.12/site-packages/ansible/plugins/action:/usr/local/lib/python3.12/site-packages/ansible/plugins/action/__pycache__) (found_in_cache=True, class_only=False) 11579 1726882204.18865: _low_level_execute_command(): starting 11579 1726882204.18897: _low_level_execute_command(): executing: /bin/sh -c 'echo ~ && sleep 0' 11579 1726882204.19611: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882204.19626: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882204.19664: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882204.19755: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882204.19775: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882204.19802: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.19821: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.19906: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.21561: stdout chunk (state=3): >>>/root <<< 11579 1726882204.21733: stderr chunk 
(state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.21737: stdout chunk (state=3): >>><<< 11579 1726882204.21739: stderr chunk (state=3): >>><<< 11579 1726882204.21761: _low_level_execute_command() done: rc=0, stdout=/root , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882204.21781: _low_level_execute_command(): starting 11579 1726882204.21867: _low_level_execute_command(): executing: /bin/sh -c '( umask 77 && mkdir -p "` echo /root/.ansible/tmp `"&& mkdir "` echo /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408 `" && echo ansible-tmp-1726882204.2176828-13212-60852152997408="` echo /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408 `" ) && sleep 0' 11579 1726882204.22448: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882204.22464: stderr chunk 
(state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882204.22479: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882204.22507: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882204.22531: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882204.22549: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882204.22611: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882204.22664: stderr chunk (state=3): >>>debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882204.22683: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.22716: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.22791: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.24683: stdout chunk (state=3): >>>ansible-tmp-1726882204.2176828-13212-60852152997408=/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408 <<< 11579 1726882204.24855: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.24859: stdout chunk (state=3): >>><<< 11579 1726882204.24862: stderr chunk (state=3): >>><<< 11579 
1726882204.25000: _low_level_execute_command() done: rc=0, stdout=ansible-tmp-1726882204.2176828-13212-60852152997408=/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408 , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882204.25004: variable 'ansible_module_compression' from source: unknown 11579 1726882204.25006: ANSIBALLZ: using cached module: /root/.ansible/tmp/ansible-local-115794i48kjv5/ansiballz_cache/ansible.modules.command-ZIP_DEFLATED 11579 1726882204.25025: variable 'ansible_facts' from source: unknown 11579 1726882204.25127: transferring module to remote /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py 11579 1726882204.25361: Sending initial data 11579 1726882204.25365: Sent initial data (155 bytes) 11579 1726882204.25945: stderr chunk (state=3): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 
Jun 2024 <<< 11579 1726882204.25960: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882204.25975: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882204.26010: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882204.26033: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found <<< 11579 1726882204.26137: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.26162: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.26244: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.27765: stderr chunk (state=3): >>>debug2: Remote version: 3 <<< 11579 1726882204.27797: stderr chunk (state=3): >>>debug2: Server supports extension "posix-rename@openssh.com" revision 1 debug2: Server supports extension "statvfs@openssh.com" revision 2 debug2: Server supports extension "fstatvfs@openssh.com" revision 2 debug2: Server supports extension "hardlink@openssh.com" revision 1 debug2: Server supports extension "fsync@openssh.com" revision 1 debug2: Server supports extension 
"lsetstat@openssh.com" revision 1 debug2: Server supports extension "limits@openssh.com" revision 1 debug2: Server supports extension "expand-path@openssh.com" revision 1 debug2: Server supports extension "copy-data" revision 1 debug2: Unrecognised server extension "home-directory" debug2: Server supports extension "users-groups-by-id@openssh.com" revision 1 <<< 11579 1726882204.27834: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_REALPATH "." <<< 11579 1726882204.27899: stdout chunk (state=3): >>>sftp> put /root/.ansible/tmp/ansible-local-115794i48kjv5/tmpmsqjxv7b /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py <<< 11579 1726882204.27902: stderr chunk (state=3): >>>debug2: Sending SSH2_FXP_STAT "/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py" <<< 11579 1726882204.27958: stderr chunk (state=3): >>>debug1: stat remote: No such file or directory debug2: sftp_upload: upload local "/root/.ansible/tmp/ansible-local-115794i48kjv5/tmpmsqjxv7b" to remote "/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py" debug2: Sending SSH2_FXP_OPEN "/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py" <<< 11579 1726882204.28838: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.28913: stderr chunk (state=3): >>><<< 11579 1726882204.29205: stdout chunk (state=3): >>><<< 11579 1726882204.29208: done transferring module to remote 11579 1726882204.29210: _low_level_execute_command(): starting 11579 1726882204.29212: _low_level_execute_command(): executing: /bin/sh -c 'chmod u+x /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/ /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py && sleep 0' 11579 1726882204.30192: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data 
/root/.ssh/config <<< 11579 1726882204.30214: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration <<< 11579 1726882204.30298: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.30595: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.32374: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.32378: stdout chunk (state=3): >>><<< 11579 1726882204.32385: stderr chunk (state=3): >>><<< 11579 1726882204.32404: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 
10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882204.32408: _low_level_execute_command(): starting 11579 1726882204.32411: _low_level_execute_command(): executing: /bin/sh -c '/usr/bin/python3.12 /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/AnsiballZ_command.py && sleep 0' 11579 1726882204.33015: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882204.33108: stderr chunk (state=3): >>>debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at 
'/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.33311: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.33489: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.64727: stdout chunk (state=3): >>> {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4538 0 --:--:-- --:--:-- --:--:-- 4552\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4106 0 
--:--:-- --:--:-- --:--:-- 4157", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:30:04.485272", "end": "2024-09-20 21:30:04.645551", "delta": "0:00:00.160279", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} <<< 11579 1726882204.66349: stderr chunk (state=3): >>>debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 
<<< 11579 1726882204.66353: stdout chunk (state=3): >>><<< 11579 1726882204.66356: stderr chunk (state=3): >>><<< 11579 1726882204.66378: _low_level_execute_command() done: rc=0, stdout= {"changed": true, "stdout": "CHECK DNS AND CONNECTIVITY\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org\n2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org\n2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org", "stderr": " % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 305 100 305 0 0 4538 0 --:--:-- --:--:-- --:--:-- 4552\n % Total % Received % Xferd Average Speed Time Time Time Current\n Dload Upload Total Spent Left Speed\n\r 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0\r100 291 100 291 0 0 4106 0 --:--:-- --:--:-- --:--:-- 4157", "rc": 0, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org 
mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "start": "2024-09-20 21:30:04.485272", "end": "2024-09-20 21:30:04.645551", "delta": "0:00:00.160279", "msg": "", "invocation": {"module_args": {"_raw_params": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "_uses_shell": true, "expand_argument_vars": true, "stdin_add_newline": true, "strip_empty_ends": true, "argv": null, "chdir": null, "executable": null, "creates": null, "removes": null, "stdin": null}}} , stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: 
mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 Shared connection to 10.31.9.159 closed. 11579 1726882204.66430: done with _execute_module (ansible.legacy.command, {'_raw_params': 'set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts "$host"; then\n echo FAILED to lookup host "$host"\n exit 1\n fi\n if ! curl -o /dev/null https://"$host"; then\n echo FAILED to contact host "$host"\n exit 1\n fi\ndone\n', '_uses_shell': True, '_ansible_check_mode': False, '_ansible_no_log': False, '_ansible_debug': True, '_ansible_diff': False, '_ansible_verbosity': 2, '_ansible_version': '2.17.4', '_ansible_module_name': 'ansible.legacy.command', '_ansible_syslog_facility': 'LOG_USER', '_ansible_selinux_special_fs': ['fuse', 'nfs', 'vboxsf', 'ramfs', '9p', 'vfat'], '_ansible_string_conversion_action': 'warn', '_ansible_socket': None, '_ansible_shell_executable': '/bin/sh', '_ansible_keep_remote_files': False, '_ansible_tmpdir': '/root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/', '_ansible_remote_tmp': '~/.ansible/tmp', '_ansible_ignore_unknown_opts': False, '_ansible_target_log_info': None}) 11579 1726882204.66437: _low_level_execute_command(): starting 11579 1726882204.66441: _low_level_execute_command(): executing: /bin/sh -c 'rm -f -r /root/.ansible/tmp/ansible-tmp-1726882204.2176828-13212-60852152997408/ > /dev/null 2>&1 && sleep 0' 11579 1726882204.67033: stderr chunk (state=2): >>>OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 <<< 11579 1726882204.67037: stderr chunk (state=3): >>>debug1: Reading configuration data /root/.ssh/config <<< 11579 1726882204.67039: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882204.67200: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf <<< 11579 1726882204.67204: stderr chunk (state=3): >>>debug2: checking 
match for 'final all' host 10.31.9.159 originally 10.31.9.159 <<< 11579 1726882204.67207: stderr chunk (state=3): >>>debug2: match not found <<< 11579 1726882204.67210: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config <<< 11579 1726882204.67212: stderr chunk (state=3): >>>debug1: configuration requests final Match pass <<< 11579 1726882204.67214: stderr chunk (state=3): >>>debug2: resolve_canonicalize: hostname 10.31.9.159 is address <<< 11579 1726882204.67216: stderr chunk (state=3): >>>debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config <<< 11579 1726882204.67219: stderr chunk (state=3): >>>debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' <<< 11579 1726882204.67222: stderr chunk (state=3): >>>debug2: fd 3 setting O_NONBLOCK <<< 11579 1726882204.67224: stderr chunk (state=3): >>>debug2: mux_client_hello_exchange: master version 4 <<< 11579 1726882204.67326: stderr chunk (state=3): >>>debug1: mux_client_request_session: master session id: 2 <<< 11579 1726882204.69384: stderr chunk (state=3): >>>debug2: Received exit status from master 0 <<< 11579 1726882204.69388: stdout chunk (state=3): >>><<< 11579 1726882204.69399: stderr chunk (state=3): >>><<< 11579 1726882204.69413: _low_level_execute_command() done: rc=0, stdout=, stderr=OpenSSH_9.8p1, OpenSSL 3.2.2 4 Jun 2024 debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match not found 
debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: configuration requests final Match pass debug2: resolve_canonicalize: hostname 10.31.9.159 is address debug1: re-parsing configuration debug1: Reading configuration data /root/.ssh/config debug1: Reading configuration data /etc/ssh/ssh_config debug1: Reading configuration data /etc/ssh/ssh_config.d/50-redhat.conf debug2: checking match for 'final all' host 10.31.9.159 originally 10.31.9.159 debug2: match found debug1: Reading configuration data /etc/crypto-policies/back-ends/openssh.config debug1: auto-mux: Trying existing master at '/root/.ansible/cp/5685534f65' debug2: fd 3 setting O_NONBLOCK debug2: mux_client_hello_exchange: master version 4 debug1: mux_client_request_session: master session id: 2 debug2: Received exit status from master 0 11579 1726882204.69419: handler run complete 11579 1726882204.69442: Evaluated conditional (False): False 11579 1726882204.69540: attempt loop complete, returning result 11579 1726882204.69543: _execute() done 11579 1726882204.69545: dumping result to json 11579 1726882204.69547: done dumping result, returning 11579 1726882204.69549: done running TaskExecutor() for managed_node1/TASK: Verify DNS and network connectivity [12673a56-9f93-f197-7423-00000000056e] 11579 1726882204.69550: sending task result for task 12673a56-9f93-f197-7423-00000000056e 11579 1726882204.69622: done sending task result for task 12673a56-9f93-f197-7423-00000000056e 11579 1726882204.69625: WORKER PROCESS EXITING ok: [managed_node1] => { "changed": false, "cmd": "set -euo pipefail\necho CHECK DNS AND CONNECTIVITY\nfor host in mirrors.fedoraproject.org mirrors.centos.org; do\n if ! getent hosts \"$host\"; then\n echo FAILED to lookup host \"$host\"\n exit 1\n fi\n if ! 
curl -o /dev/null https://\"$host\"; then\n echo FAILED to contact host \"$host\"\n exit 1\n fi\ndone\n", "delta": "0:00:00.160279", "end": "2024-09-20 21:30:04.645551", "rc": 0, "start": "2024-09-20 21:30:04.485272" } STDOUT: CHECK DNS AND CONNECTIVITY 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.fedoraproject.org 2604:1580:fe00:0:dead:beef:cafe:fed1 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed7 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2620:52:3:1:dead:beef:cafe:fed6 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:2701:4000:5211:dead:beef:fe:fed3 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2605:bc80:3010:600:dead:beef:cafe:fed9 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org 2600:1f14:fad:5c02:7c8a:72d0:1c58:c189 wildcard.fedoraproject.org mirrors.centos.org mirrors.fedoraproject.org STDERR: % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 305 100 305 0 0 4538 0 --:--:-- --:--:-- --:--:-- 4552 % Total % Received % Xferd Average Speed Time Time Time Current Dload Upload Total Spent Left Speed 0 0 0 0 0 0 0 0 --:--:-- --:--:-- --:--:-- 0 100 291 100 291 0 0 4106 0 --:--:-- --:--:-- --:--:-- 4157 11579 1726882204.69705: no more pending results, returning what we have 11579 1726882204.69709: results queue empty 11579 1726882204.69710: 
checking for any_errors_fatal 11579 1726882204.69720: done checking for any_errors_fatal 11579 1726882204.69721: checking for max_fail_percentage 11579 1726882204.69722: done checking for max_fail_percentage 11579 1726882204.69727: checking to see if all hosts have failed and the running result is not ok 11579 1726882204.69729: done checking to see if all hosts have failed 11579 1726882204.69730: getting the remaining hosts for this loop 11579 1726882204.69731: done getting the remaining hosts for this loop 11579 1726882204.69735: getting the next task for host managed_node1 11579 1726882204.69744: done getting next task for host managed_node1 11579 1726882204.69746: ^ task is: TASK: meta (flush_handlers) 11579 1726882204.69747: ^ state is: HOST STATE: block=3, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? 
False 11579 1726882204.69753: getting variables 11579 1726882204.69755: in VariableManager get_vars() 11579 1726882204.69949: Calling all_inventory to load vars for managed_node1 11579 1726882204.69952: Calling groups_inventory to load vars for managed_node1 11579 1726882204.69955: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882204.69967: Calling all_plugins_play to load vars for managed_node1 11579 1726882204.69970: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882204.69973: Calling groups_plugins_play to load vars for managed_node1 11579 1726882204.71701: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882204.73754: done with get_vars() 11579 1726882204.73784: done getting variables 11579 1726882204.73877: in VariableManager get_vars() 11579 1726882204.73895: Calling all_inventory to load vars for managed_node1 11579 1726882204.73898: Calling groups_inventory to load vars for managed_node1 11579 1726882204.73900: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882204.73906: Calling all_plugins_play to load vars for managed_node1 11579 1726882204.73908: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882204.73911: Calling groups_plugins_play to load vars for managed_node1 11579 1726882204.75240: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882204.77722: done with get_vars() 11579 1726882204.77760: done queuing things up, now waiting for results queue to drain 11579 1726882204.77763: results queue empty 11579 1726882204.77764: checking for any_errors_fatal 11579 1726882204.77768: done checking for any_errors_fatal 11579 1726882204.77769: checking for max_fail_percentage 11579 1726882204.77770: done checking for max_fail_percentage 11579 1726882204.77771: checking to see if all hosts have failed and the running result is not 
ok 11579 1726882204.77772: done checking to see if all hosts have failed 11579 1726882204.77772: getting the remaining hosts for this loop 11579 1726882204.77773: done getting the remaining hosts for this loop 11579 1726882204.77776: getting the next task for host managed_node1 11579 1726882204.77781: done getting next task for host managed_node1 11579 1726882204.77782: ^ task is: TASK: meta (flush_handlers) 11579 1726882204.77784: ^ state is: HOST STATE: block=4, task=1, rescue=0, always=0, handlers=0, run_state=1, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False 11579 1726882204.77787: getting variables 11579 1726882204.77788: in VariableManager get_vars() 11579 1726882204.78041: Calling all_inventory to load vars for managed_node1 11579 1726882204.78044: Calling groups_inventory to load vars for managed_node1 11579 1726882204.78046: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882204.78052: Calling all_plugins_play to load vars for managed_node1 11579 1726882204.78054: Calling groups_plugins_inventory to load vars for managed_node1 11579 1726882204.78057: Calling groups_plugins_play to load vars for managed_node1 11579 1726882204.79600: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name 11579 1726882204.83522: done with get_vars() 11579 1726882204.83553: done getting variables 11579 1726882204.83728: in VariableManager get_vars() 11579 1726882204.83745: Calling all_inventory to load vars for managed_node1 11579 1726882204.83748: Calling groups_inventory to load vars for managed_node1 11579 1726882204.83750: Calling all_plugins_inventory to load vars for managed_node1 11579 1726882204.83755: Calling all_plugins_play to load vars for managed_node1 11579 1726882204.83758: Calling groups_plugins_inventory to load vars for 
managed_node1
11579 1726882204.83761: Calling groups_plugins_play to load vars for managed_node1
11579 1726882204.86360: '/usr/local/lib/python3.12/site-packages/ansible/plugins/connection/__init__' skipped due to reserved name
11579 1726882204.89540: done with get_vars()
11579 1726882204.89571: done queuing things up, now waiting for results queue to drain
11579 1726882204.89574: results queue empty
11579 1726882204.89574: checking for any_errors_fatal
11579 1726882204.89576: done checking for any_errors_fatal
11579 1726882204.89576: checking for max_fail_percentage
11579 1726882204.89577: done checking for max_fail_percentage
11579 1726882204.89578: checking to see if all hosts have failed and the running result is not ok
11579 1726882204.89579: done checking to see if all hosts have failed
11579 1726882204.89579: getting the remaining hosts for this loop
11579 1726882204.89580: done getting the remaining hosts for this loop
11579 1726882204.89583: getting the next task for host managed_node1
11579 1726882204.89587: done getting next task for host managed_node1
11579 1726882204.89588: ^ task is: None
11579 1726882204.89589: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False
11579 1726882204.89590: done queuing things up, now waiting for results queue to drain
11579 1726882204.89591: results queue empty
11579 1726882204.89592: checking for any_errors_fatal
11579 1726882204.89596: done checking for any_errors_fatal
11579 1726882204.89597: checking for max_fail_percentage
11579 1726882204.89598: done checking for max_fail_percentage
11579 1726882204.89599: checking to see if all hosts have failed and the running result is not ok
11579 1726882204.89599: done checking to see if all hosts have failed
11579 1726882204.89601: getting the next task for host managed_node1
11579 1726882204.89604: done getting next task for host managed_node1
11579 1726882204.89605: ^ task is: None
11579 1726882204.89606: ^ state is: HOST STATE: block=5, task=0, rescue=0, always=0, handlers=0, run_state=5, fail_state=0, pre_flushing_run_state=1, update_handlers=True, pending_setup=False, tasks child state? (None), rescue child state? (None), always child state? (None), did rescue? False, did start at task? False

PLAY RECAP *********************************************************************
managed_node1              : ok=75   changed=3    unreachable=0    failed=0    skipped=61   rescued=0    ignored=0

Friday 20 September 2024  21:30:04 -0400 (0:00:00.740)       0:00:33.605 ******
===============================================================================
Install dnsmasq --------------------------------------------------------- 2.23s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:3
fedora.linux_system_roles.network : Check which services are running ---- 1.85s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
fedora.linux_system_roles.network : Check which services are running ---- 1.81s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:21
Create test interfaces -------------------------------------------------- 1.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:35
Gathering Facts --------------------------------------------------------- 1.60s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tests_bond_nm.yml:6
Gathering Facts --------------------------------------------------------- 1.16s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:3
fedora.linux_system_roles.network : Configure networking connection profiles --- 1.02s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Configure networking connection profiles --- 0.96s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:159
fedora.linux_system_roles.network : Check which packages are installed --- 0.96s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Check which packages are installed --- 0.94s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/set_facts.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.93s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
Verify DNS and network connectivity ------------------------------------- 0.74s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/check_network_dns.yml:24
Check if system is ostree ----------------------------------------------- 0.73s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/tasks/el_repo_setup.yml:17
Install pgrep, sysctl --------------------------------------------------- 0.71s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tasks/create_test_interfaces_with_dhcp.yml:26
fedora.linux_system_roles.network : Enable and start NetworkManager ----- 0.65s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:122
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.54s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
** TEST check IPv6 ------------------------------------------------------ 0.53s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:87
** TEST check IPv4 ------------------------------------------------------ 0.52s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:80
** TEST check polling interval ------------------------------------------ 0.49s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/tests/network/playbooks/tests_bond.yml:75
fedora.linux_system_roles.network : Re-test connectivity ---------------- 0.46s
/tmp/collections-spT/ansible_collections/fedora/linux_system_roles/roles/network/tasks/main.yml:192
11579 1726882204.89732: RUNNING CLEANUP